Apr 16 18:15:29.796349 ip-10-0-135-125 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:15:30.236100 ip-10-0-135-125 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:15:30.236100 ip-10-0-135-125 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:15:30.236100 ip-10-0-135-125 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:15:30.236100 ip-10-0-135-125 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:15:30.236100 ip-10-0-135-125 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:15:30.237773 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.237688 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:15:30.241477 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241462 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:15:30.241477 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241477 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241480 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241483 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241486 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241489 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241492 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241496 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241499 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241502 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241505 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241507 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241510 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241512 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241515 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241518 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241520 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241524 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241527 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241530 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241532 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:15:30.241541 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241535 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241539 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241543 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241546 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241550 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241553 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241556 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241559 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241561 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241564 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241567 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241569 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241572 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241574 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241577 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241579 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241582 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241585 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241587 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:15:30.242054 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241590 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241592 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241595 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241598 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241603 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241607 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241610 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241612 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241616 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241618 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241621 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241624 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241626 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241629 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241632 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241636 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241639 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241642 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241644 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:15:30.242532 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241647 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241650 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241653 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241655 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241658 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241661 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241663 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241666 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241669 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241671 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241674 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241676 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241679 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241681 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241684 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241688 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241692 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241694 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241697 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241700 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:15:30.242991 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241702 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241705 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241708 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241710 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241713 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241715 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.241718 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242112 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242118 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242121 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242124 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242127 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242130 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242132 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242135 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242137 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242140 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242142 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242145 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242149 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:15:30.243486 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242151 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242154 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242156 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242159 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242161 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242164 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242166 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242169 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242171 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242174 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242177 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242179 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242182 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242184 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242187 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242189 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242192 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242194 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242197 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242200 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:15:30.243954 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242203 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242205 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242208 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242211 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242213 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242216 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242218 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242221 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242224 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242226 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242229 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242231 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242240 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242243 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242245 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242248 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242251 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242254 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242256 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242259 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:15:30.244452 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242261 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242265 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242287 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242290 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242293 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242296 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242298 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242301 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242304 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242306 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242309 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242312 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242315 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242318 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242320 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242323 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242325 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242328 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242330 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242334 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:15:30.244938 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242336 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242339 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242341 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242343 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242346 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242356 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242359 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242361 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242364 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242366 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242369 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242371 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.242375 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244018 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244026 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244032 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244037 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244041 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244044 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244049 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:15:30.245446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244058 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244061 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244064 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244068 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244087 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244091 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244094 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244097 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244100 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244103 2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244105 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244108 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244113 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244116 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244119 2573 flags.go:64] FLAG: --config-dir=""
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244122 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244125 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244129 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244133 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244136 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244139 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244142 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244145 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244148 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244151 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:15:30.245924 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244154 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244158 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244161 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244164 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244167 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244170 2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244173 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244177 2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244180 2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244183 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244187 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244192 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244196 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244199 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244202 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244205 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244208 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244211 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]:
I0416 18:15:30.244214 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244217 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244220 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244223 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244226 2573 flags.go:64] FLAG: --feature-gates="" Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244230 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244233 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:15:30.246538 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244236 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244249 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244253 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244256 2573 flags.go:64] FLAG: --help="false" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244259 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244262 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244265 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244268 2573 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244272 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244275 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244278 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244281 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244283 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244286 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244289 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244292 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244295 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244299 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244303 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244306 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244309 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:15:30.247175 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:15:30.244312 2573 flags.go:64] FLAG: --lock-file="" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244315 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244319 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:15:30.247175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244322 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244331 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244334 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244337 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244340 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244343 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244346 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244349 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244352 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244356 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244365 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244369 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 
18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244373 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244375 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244387 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244390 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244393 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244396 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244399 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244406 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244410 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244413 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244416 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:15:30.247762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244419 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244424 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:15:30.244428 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244431 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244435 2573 flags.go:64] FLAG: --port="10250" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244438 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244441 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e954e5fc6784716c" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244444 2573 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244447 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244450 2573 flags.go:64] FLAG: --register-node="true" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244453 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244456 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244460 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244463 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244466 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244469 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244473 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244476 2573 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244479 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244482 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244490 2573 flags.go:64] FLAG: --runonce="false" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244493 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244496 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244500 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244503 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244505 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:15:30.248358 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244508 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244512 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244515 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244519 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244522 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244525 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 
18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244527 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244531 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244534 2573 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244536 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244543 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244546 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244549 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244556 2573 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244558 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244561 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244564 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244567 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244570 2573 flags.go:64] FLAG: --v="2" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244575 2573 flags.go:64] FLAG: --version="false" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244578 2573 flags.go:64] FLAG: --vmodule="" 
Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244582 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.244586 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244694 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:15:30.248982 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244699 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244702 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244705 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244708 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244712 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244715 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244717 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244720 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244722 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244725 2573 feature_gate.go:328] unrecognized 
feature gate: GCPClusterHostedDNSInstall Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244728 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244730 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244733 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244736 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244738 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244741 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244743 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244746 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244749 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244752 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:15:30.249611 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244755 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244757 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244760 
2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244763 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244766 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244768 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244771 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244774 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244777 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244779 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244784 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244787 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244789 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244792 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244795 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244797 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244808 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244811 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244813 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:15:30.250157 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244816 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244818 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244821 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244823 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244826 2573 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244828 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244831 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244833 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244836 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244839 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244841 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244845 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244848 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244850 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244853 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244855 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244858 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 
18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244861 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244863 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244867 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:15:30.250621 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244870 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244873 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244876 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244879 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244882 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244885 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244888 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244891 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244893 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244898 2573 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244908 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244910 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244913 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244915 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244918 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244920 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244923 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244926 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244929 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244931 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:15:30.251143 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244934 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:15:30.251646 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244937 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration 
Apr 16 18:15:30.251646 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244939 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:15:30.251646 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244943 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:15:30.251646 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244946 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:15:30.251646 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.244948 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:15:30.251646 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.245835 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:15:30.252899 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.252879 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:15:30.252933 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.252900 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:15:30.252965 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252949 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:15:30.252965 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252955 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:15:30.252965 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252958 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:15:30.252965 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252961 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:15:30.252965 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252965 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:15:30.252965 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252968 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252971 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252974 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252977 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252980 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252983 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252986 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252989 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252991 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252994 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252996 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.252999 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253002 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253005 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253008 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253010 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253013 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253016 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253018 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253021 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:15:30.253130 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253023 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253026 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253029 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253032 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253037 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253040 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253043 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253046 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253048 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253051 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253054 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253056 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253059 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253061 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253065 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253067 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253070 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253088 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253092 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:15:30.253634 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253095 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253098 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253101 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253103 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253106 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253109 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253112 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253115 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253118 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253122 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253126 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253129 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253132 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253134 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253137 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253140 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253143 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253146 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253149 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:15:30.254115 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253151 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253155 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253157 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253160 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253163 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253167 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253169 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253172 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253175 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253178 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253180 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253183 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253185 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253188 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253191 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253193 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253196 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253198 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253202 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253204 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:15:30.254613 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253207 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253209 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253212 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.253217 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253332 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253338 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253341 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253344 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253347 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253349 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253352 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253355 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253358 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253360 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253368 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253371 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:15:30.255123 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253374 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253377 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253379 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253382 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253384 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253387 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253390 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253392 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253405 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253408 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253411 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253414 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253416 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253419 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253421 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253424 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253426 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253429 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253432 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253436 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:15:30.255521 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253439 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253441 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253444 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253447 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253449 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253452 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253454 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253457 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253459 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253462 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253464 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253474 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253477 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253480 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253483 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253485 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253488 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253491 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253493 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:15:30.255998 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253496 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253499 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253502 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253504 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253507 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253509 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253513 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253516 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253519 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253522 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253525 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253527 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253530 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253532 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253535 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253538 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253540 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253543 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253545 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:15:30.256474 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253548 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253550 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253552 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253555 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253558 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253560 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253568 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253571 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253574 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253576 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253579 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253582 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253585 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253588 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253590 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:30.253593 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:15:30.256934 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.253598 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:15:30.257332 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.254324 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:15:30.257332 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.256520 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:15:30.257553 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.257541 2573 server.go:1019] "Starting client certificate rotation"
Apr 16 18:15:30.257653 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.257637 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:15:30.257686 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.257673 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:15:30.282548 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.282527 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:15:30.288545 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.288519 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:15:30.306629 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.306606 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:15:30.312314 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.312298 2573 log.go:25] "Validated CRI v1 image API"
Apr 16 18:15:30.313671 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.313657 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:15:30.316270 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.316251 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b5e000ea-f5ad-473a-8a87-64d23005c6ed:/dev/nvme0n1p3 e9ea2cd3-5f53-4fed-bb0c-169945f5d9eb:/dev/nvme0n1p4]
Apr 16 18:15:30.316315 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.316271 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:15:30.321898 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.321797 2573 manager.go:217] Machine: {Timestamp:2026-04-16 18:15:30.319938195 +0000 UTC m=+0.408487616 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3086826 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20b3fe19a0f0b1f0716f026c53cf6c SystemUUID:ec20b3fe-19a0-f0b1-f071-6f026c53cf6c BootID:933e1346-516e-4d06-85f2-cb37ba0ecbe7 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cc:7b:2f:85:31 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cc:7b:2f:85:31 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:80:2f:b6:c5:11 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:15:30.321898 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.321894 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:15:30.322037 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.321967 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:15:30.324341 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.324320 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:15:30.324495 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.324343 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-125.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:15:30.324542 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.324504 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:15:30.324542 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.324513 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:15:30.324542 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.324530
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:15:30.325272 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.325262 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:15:30.326936 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.326927 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:15:30.327040 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.327032 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:15:30.330194 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.330185 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:15:30.330231 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.330198 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:15:30.330231 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.330209 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:15:30.330231 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.330218 2573 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:15:30.330231 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.330227 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:15:30.331227 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.331215 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:15:30.331264 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.331240 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:15:30.334283 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.334254 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:15:30.334475 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:15:30.334456 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:15:30.336748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.336735 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:15:30.339453 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339418 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:15:30.339547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339459 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:15:30.339547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339470 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:15:30.339547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339480 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:15:30.339547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339489 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:15:30.339547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339505 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:15:30.339547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339515 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:15:30.339547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339523 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:15:30.339830 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339549 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:15:30.339830 ip-10-0-135-125 kubenswrapper[2573]: 
I0416 18:15:30.339572 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:15:30.339830 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339599 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:15:30.339830 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.339613 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:15:30.340582 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.340565 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:15:30.340616 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.340591 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:15:30.344747 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.344733 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:15:30.344822 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.344776 2573 server.go:1295] "Started kubelet" Apr 16 18:15:30.344873 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.344853 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:15:30.344976 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.344926 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:15:30.345031 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.345001 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:15:30.345345 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.345323 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-125.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:15:30.345345 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.345325 2573 
reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:15:30.345466 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.345362 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-125.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:15:30.345627 ip-10-0-135-125 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:15:30.347397 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.347384 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:15:30.348666 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.348649 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:15:30.355332 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.354362 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-125.ec2.internal.18a6e909913a81b1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-125.ec2.internal,UID:ip-10-0-135-125.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-125.ec2.internal,},FirstTimestamp:2026-04-16 18:15:30.344747441 +0000 UTC m=+0.433296845,LastTimestamp:2026-04-16 18:15:30.344747441 +0000 UTC m=+0.433296845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-125.ec2.internal,}" Apr 16 18:15:30.355504 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.355490 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:15:30.355746 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.355719 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:15:30.356205 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.356186 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:15:30.356760 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.356743 2573 factory.go:55] Registering systemd factory Apr 16 18:15:30.356760 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.356761 2573 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:15:30.356962 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.356948 2573 factory.go:153] Registering CRI-O factory Apr 16 18:15:30.357019 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.356966 2573 factory.go:223] Registration of the crio container factory successfully Apr 16 18:15:30.357120 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.357029 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:15:30.357120 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.357053 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:30.357120 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.357097 2573 factory.go:103] Registering Raw factory Apr 16 18:15:30.357120 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:15:30.357115 2573 manager.go:1196] Started watching for new ooms in manager Apr 16 18:15:30.357310 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.357113 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:15:30.357310 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.357172 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:15:30.357310 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.357099 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:15:30.357310 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.357304 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:15:30.357508 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.357317 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:15:30.357574 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.357560 2573 manager.go:319] Starting recovery of all containers Apr 16 18:15:30.361959 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.361778 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-125.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:15:30.362034 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.361781 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:15:30.368878 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.368863 2573 manager.go:324] Recovery completed Apr 16 18:15:30.374276 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.374260 2573 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:30.379130 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.379106 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:30.379201 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.379142 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:30.379201 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.379152 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:30.379640 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.379627 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:15:30.379640 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.379639 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:15:30.379711 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.379656 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:15:30.382935 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.381053 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-125.ec2.internal.18a6e90993472329 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-125.ec2.internal,UID:ip-10-0-135-125.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-125.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-125.ec2.internal,},FirstTimestamp:2026-04-16 18:15:30.379129641 +0000 UTC 
m=+0.467679047,LastTimestamp:2026-04-16 18:15:30.379129641 +0000 UTC m=+0.467679047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-125.ec2.internal,}" Apr 16 18:15:30.383212 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.383201 2573 policy_none.go:49] "None policy: Start" Apr 16 18:15:30.383261 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.383216 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:15:30.383261 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.383227 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:15:30.397681 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.397622 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-125.ec2.internal.18a6e9099347675e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-125.ec2.internal,UID:ip-10-0-135-125.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-135-125.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-135-125.ec2.internal,},FirstTimestamp:2026-04-16 18:15:30.379147102 +0000 UTC m=+0.467696508,LastTimestamp:2026-04-16 18:15:30.379147102 +0000 UTC m=+0.467696508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-125.ec2.internal,}" Apr 16 18:15:30.415605 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.415588 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-knvjv" Apr 16 18:15:30.415718 
ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.415656 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-125.ec2.internal.18a6e90993478b63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-125.ec2.internal,UID:ip-10-0-135-125.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-135-125.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-135-125.ec2.internal,},FirstTimestamp:2026-04-16 18:15:30.379156323 +0000 UTC m=+0.467705730,LastTimestamp:2026-04-16 18:15:30.379156323 +0000 UTC m=+0.467705730,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-125.ec2.internal,}" Apr 16 18:15:30.425038 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.425019 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-knvjv" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.430642 2573 manager.go:341] "Starting Device Plugin manager" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.430666 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.430676 2573 server.go:85] "Starting device plugin registration server" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.430880 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.430891 2573 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.430977 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.431060 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.431069 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.431584 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:15:30.444327 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.431613 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:30.497843 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.497793 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:15:30.498931 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.498917 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:15:30.499017 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.498941 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:15:30.499017 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.498960 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:15:30.499017 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.498966 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:15:30.499017 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.499000 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:15:30.501625 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.501609 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:30.531925 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.531910 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:30.532769 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.532750 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:30.532851 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.532777 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:30.532851 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.532787 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:30.532851 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.532806 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.547860 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.547839 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.547860 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.547859 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-125.ec2.internal\": node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 
18:15:30.592627 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.592608 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:30.599556 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.599528 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal"] Apr 16 18:15:30.599633 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.599589 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:30.600304 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.600285 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:30.600378 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.600314 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:30.600378 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.600324 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:30.601802 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.601790 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:30.601935 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.601922 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.601975 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.601950 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:30.602469 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.602443 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:30.602469 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.602450 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:30.602469 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.602469 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:30.602660 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.602480 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:30.602660 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.602488 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:30.602660 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.602505 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:30.604631 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.604614 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.604711 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.604642 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:15:30.605283 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.605253 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:15:30.605283 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.605277 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:15:30.605392 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.605289 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:15:30.618593 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.618576 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-125.ec2.internal\" not found" node="ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.621941 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.621924 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-125.ec2.internal\" not found" node="ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.658597 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.658573 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1df915e3f3dcc8bf40de1ae187fd712-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal\" (UID: \"c1df915e3f3dcc8bf40de1ae187fd712\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.693477 ip-10-0-135-125 
kubenswrapper[2573]: E0416 18:15:30.693462 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:30.759315 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.759263 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1df915e3f3dcc8bf40de1ae187fd712-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal\" (UID: \"c1df915e3f3dcc8bf40de1ae187fd712\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.759315 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.759292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/50124a769f014399bcd19764e87e11fb-config\") pod \"kube-apiserver-proxy-ip-10-0-135-125.ec2.internal\" (UID: \"50124a769f014399bcd19764e87e11fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.759430 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.759326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1df915e3f3dcc8bf40de1ae187fd712-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal\" (UID: \"c1df915e3f3dcc8bf40de1ae187fd712\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.759430 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.759365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1df915e3f3dcc8bf40de1ae187fd712-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal\" (UID: \"c1df915e3f3dcc8bf40de1ae187fd712\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.794020 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.794002 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:30.860166 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.860137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1df915e3f3dcc8bf40de1ae187fd712-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal\" (UID: \"c1df915e3f3dcc8bf40de1ae187fd712\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.860262 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.860193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1df915e3f3dcc8bf40de1ae187fd712-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal\" (UID: \"c1df915e3f3dcc8bf40de1ae187fd712\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.860262 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.860244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/50124a769f014399bcd19764e87e11fb-config\") pod \"kube-apiserver-proxy-ip-10-0-135-125.ec2.internal\" (UID: \"50124a769f014399bcd19764e87e11fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.860337 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.860289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/50124a769f014399bcd19764e87e11fb-config\") pod \"kube-apiserver-proxy-ip-10-0-135-125.ec2.internal\" (UID: 
\"50124a769f014399bcd19764e87e11fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.895013 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.894993 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:30.923707 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.923689 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.925760 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:30.925744 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" Apr 16 18:15:30.995212 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:30.995188 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:31.096293 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:31.096228 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:31.196774 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:31.196751 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:31.256965 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.256937 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:15:31.257605 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.257067 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items 
received" Apr 16 18:15:31.297430 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:31.297391 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:31.355751 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.355695 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:15:31.381973 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.381946 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:15:31.398151 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:31.398127 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-125.ec2.internal\" not found" Apr 16 18:15:31.420905 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:31.420875 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50124a769f014399bcd19764e87e11fb.slice/crio-d1c05363245164e3f670acac96ad610677af62cdd5767e0418ab10cc5ef1c399 WatchSource:0}: Error finding container d1c05363245164e3f670acac96ad610677af62cdd5767e0418ab10cc5ef1c399: Status 404 returned error can't find the container with id d1c05363245164e3f670acac96ad610677af62cdd5767e0418ab10cc5ef1c399 Apr 16 18:15:31.421107 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:31.421092 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1df915e3f3dcc8bf40de1ae187fd712.slice/crio-2b8d73f818daf9fe5ffd647e8c4fc8aaa126d52296b3e93cfbf723800f595738 WatchSource:0}: Error finding container 2b8d73f818daf9fe5ffd647e8c4fc8aaa126d52296b3e93cfbf723800f595738: Status 404 returned error can't find the container with id 2b8d73f818daf9fe5ffd647e8c4fc8aaa126d52296b3e93cfbf723800f595738 Apr 16 
18:15:31.424925 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.424911 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:15:31.427594 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.427568 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:10:30 +0000 UTC" deadline="2027-10-21 21:01:27.040447711 +0000 UTC" Apr 16 18:15:31.427662 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.427595 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13274h45m55.612856959s" Apr 16 18:15:31.430690 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.430675 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7gvvm" Apr 16 18:15:31.443560 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.443541 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7gvvm" Apr 16 18:15:31.472066 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.472050 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:31.501251 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.501217 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" event={"ID":"c1df915e3f3dcc8bf40de1ae187fd712","Type":"ContainerStarted","Data":"2b8d73f818daf9fe5ffd647e8c4fc8aaa126d52296b3e93cfbf723800f595738"} Apr 16 18:15:31.502222 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.502196 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" 
event={"ID":"50124a769f014399bcd19764e87e11fb","Type":"ContainerStarted","Data":"d1c05363245164e3f670acac96ad610677af62cdd5767e0418ab10cc5ef1c399"} Apr 16 18:15:31.557405 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.557384 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" Apr 16 18:15:31.575407 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.575386 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:15:31.577157 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.577145 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" Apr 16 18:15:31.599900 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.599883 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:15:31.849969 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.849881 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:31.894678 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:31.894643 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:32.160893 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.160781 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:15:32.332650 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.332619 2573 apiserver.go:52] "Watching apiserver" Apr 16 18:15:32.347778 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.347751 2573 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:15:32.348149 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.348127 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj","openshift-dns/node-resolver-82pct","openshift-multus/multus-additional-cni-plugins-xflf7","openshift-multus/network-metrics-daemon-7rqfn","openshift-network-diagnostics/network-check-target-2xsjc","openshift-network-operator/iptables-alerter-cbp5h","openshift-cluster-node-tuning-operator/tuned-jp5k9","openshift-image-registry/node-ca-8694n","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal","openshift-multus/multus-dp48w","openshift-ovn-kubernetes/ovnkube-node-25kmv","kube-system/konnectivity-agent-q7ctb"] Apr 16 18:15:32.351937 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.351911 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.353377 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.353357 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.354794 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.354774 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.354904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.354883 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.356493 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.356473 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:32.356591 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.356535 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083" Apr 16 18:15:32.357895 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.357877 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.361158 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.361141 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.361301 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.361286 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.363030 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.363014 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:15:32.363132 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.363016 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:15:32.363260 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.363237 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:32.365035 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.365016 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:15:32.365145 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.365042 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:15:32.365145 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.365111 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:15:32.365251 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.365164 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:32.365251 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.365229 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4" Apr 16 18:15:32.365541 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.365527 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dpdl8\"" Apr 16 18:15:32.366335 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366307 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:15:32.366416 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366353 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc6t2\" (UniqueName: \"kubernetes.io/projected/1c559edc-905d-4a66-b3b6-e4767670d083-kube-api-access-xc6t2\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:32.366416 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366383 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-tuned\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.366416 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-device-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.366608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:15:32.366608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9125009d-485e-443b-85bb-384f2afc6de2-hosts-file\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.366608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysctl-d\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.366608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-os-release\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.366608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.366608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.366608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366594 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-systemd\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366639 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-sys\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366665 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-registration-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.366904 
ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366688 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-modprobe-d\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366707 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-etc-selinux\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jzx\" (UniqueName: \"kubernetes.io/projected/9125009d-485e-443b-85bb-384f2afc6de2-kube-api-access-g7jzx\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-system-cni-dir\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:15:32.366775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-kubernetes\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-var-lib-kubelet\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366823 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvfbz\" (UniqueName: \"kubernetes.io/projected/72016a61-4b3e-4be4-9f3e-5106c032b993-kube-api-access-zvfbz\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cni-binary-copy\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.366904 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366870 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szgvn\" (UniqueName: \"kubernetes.io/projected/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-kube-api-access-szgvn\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysconfig\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.366980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-lib-modules\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-host\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367046 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72016a61-4b3e-4be4-9f3e-5106c032b993-tmp\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-host-slash\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367117 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-run\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367145 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-iptables-alerter-script\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-socket-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367214 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-sys-fs\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367236 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7242h\" (UniqueName: \"kubernetes.io/projected/9f8cd101-1d2b-4e8c-8a93-709cc894653b-kube-api-access-7242h\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367259 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9125009d-485e-443b-85bb-384f2afc6de2-tmp-dir\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cnibin\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.367528 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:15:32.367337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysctl-conf\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkmrh\" (UniqueName: \"kubernetes.io/projected/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-kube-api-access-fkmrh\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.367528 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.368458 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.367929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:15:32.368458 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.368315 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:15:32.368458 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.368365 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:15:32.368458 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.368379 2573 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8n2dn\"" Apr 16 18:15:32.369209 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.369192 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:15:32.372231 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.372216 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:15:32.373838 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.373820 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:15:32.373838 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.373834 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:15:32.375805 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.375730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:15:32.375805 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.375744 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:15:32.375805 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.375755 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:15:32.376010 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.375836 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q9cpn\"" Apr 16 18:15:32.376010 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.375899 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:15:32.376010 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.375913 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:15:32.376010 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.375914 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8rdb2\"" Apr 16 18:15:32.376366 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.376343 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-72ndk\"" Apr 16 18:15:32.376793 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.376349 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:15:32.376869 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.376793 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:15:32.377487 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.377467 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:15:32.377875 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.377820 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:15:32.377875 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.377833 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:15:32.378010 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.377973 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:15:32.378452 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.378431 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-r6pjn\"" Apr 16 18:15:32.378723 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.378708 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r9x9j\"" Apr 16 18:15:32.378922 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.378902 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vcbcz\"" Apr 16 18:15:32.379481 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.379461 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-slcpk\"" Apr 16 18:15:32.379745 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.379730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:15:32.444228 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.444200 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:10:31 +0000 UTC" deadline="2027-12-12 17:48:13.887523958 +0000 UTC" Apr 16 18:15:32.444228 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.444224 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14519h32m41.443301644s" Apr 16 18:15:32.459059 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.459042 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:15:32.467645 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467623 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkc4m\" (UniqueName: \"kubernetes.io/projected/44520956-97f4-441e-acae-e1c5b82de2ea-kube-api-access-vkc4m\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.467764 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467653 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:32.467764 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysctl-conf\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.467764 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-kubelet\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.467929 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-etc-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.467929 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467792 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-ovn\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.467929 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysctl-conf\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.467929 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-env-overrides\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.467929 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3-agent-certs\") pod \"konnectivity-agent-q7ctb\" (UID: \"7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3\") " pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:32.467929 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.468202 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.468202 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467956 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.468202 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.467980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-systemd\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.468202 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468117 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-registration-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.468202 ip-10-0-135-125 kubenswrapper[2573]: 
I0416 18:15:32.468141 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-systemd\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.468202 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jzx\" (UniqueName: \"kubernetes.io/projected/9125009d-485e-443b-85bb-384f2afc6de2-kube-api-access-g7jzx\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.468202 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-cni-binary-copy\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-modprobe-d\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-registration-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.468526 
ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-k8s-cni-cncf-io\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-netns\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468294 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-hostroot\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-daemon-config\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44520956-97f4-441e-acae-e1c5b82de2ea-host\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" 
Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468347 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-modprobe-d\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-kubernetes\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-slash\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-sys-fs\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-system-cni-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " 
pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-kubernetes\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-sys-fs\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.468526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468514 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szgvn\" (UniqueName: 
\"kubernetes.io/projected/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-kube-api-access-szgvn\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-lib-modules\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-host\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468636 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-host-slash\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovn-node-metrics-cert\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468686 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-run\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-lib-modules\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-host\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-iptables-alerter-script\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468766 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-host-slash\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468826 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-cni-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468868 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-os-release\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468894 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-cni-bin\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468922 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-run\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468933 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-var-lib-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468941 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.469273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.468981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-run-ovn-kubernetes\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cnibin\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkmrh\" (UniqueName: \"kubernetes.io/projected/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-kube-api-access-fkmrh\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469009 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xflf7\" (UID: 
\"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469058 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469110 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-node-log\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cnibin\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc6t2\" (UniqueName: \"kubernetes.io/projected/1c559edc-905d-4a66-b3b6-e4767670d083-kube-api-access-xc6t2\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-tuned\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-iptables-alerter-script\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-device-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9125009d-485e-443b-85bb-384f2afc6de2-hosts-file\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469239 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-conf-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469264 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-etc-kubernetes\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-device-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469324 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9125009d-485e-443b-85bb-384f2afc6de2-hosts-file\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.470190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-systemd-units\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469365 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-run-netns\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysctl-d\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469425 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-log-socket\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-cni-netd\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469474 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469482 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3-konnectivity-ca\") pod \"konnectivity-agent-q7ctb\" (UID: \"7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3\") " pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469587 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysctl-d\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-os-release\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-os-release\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469649 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-sys\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-etc-selinux\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-socket-dir-parent\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.469730 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-sys\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469759 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovnkube-config\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.470967 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469780 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovnkube-script-lib\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.469808 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:32.969777966 +0000 UTC m=+3.058327384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-etc-selinux\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhnxz\" (UniqueName: \"kubernetes.io/projected/c5641439-7e2a-42bc-ae08-1777c6dcb692-kube-api-access-jhnxz\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-system-cni-dir\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469913 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-var-lib-kubelet\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 
18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-system-cni-dir\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvfbz\" (UniqueName: \"kubernetes.io/projected/72016a61-4b3e-4be4-9f3e-5106c032b993-kube-api-access-zvfbz\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-var-lib-kubelet\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.469986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-cnibin\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-cni-multus\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " 
pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-systemd\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-socket-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7242h\" (UniqueName: \"kubernetes.io/projected/9f8cd101-1d2b-4e8c-8a93-709cc894653b-kube-api-access-7242h\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9125009d-485e-443b-85bb-384f2afc6de2-tmp-dir\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-multus-certs\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.471743 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cni-binary-copy\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9f8cd101-1d2b-4e8c-8a93-709cc894653b-socket-dir\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysconfig\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72016a61-4b3e-4be4-9f3e-5106c032b993-tmp\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-sysconfig\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd5dq\" (UniqueName: \"kubernetes.io/projected/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-kube-api-access-hd5dq\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-kubelet\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470373 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-cni-bin\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470396 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/44520956-97f4-441e-acae-e1c5b82de2ea-serviceca\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9125009d-485e-443b-85bb-384f2afc6de2-tmp-dir\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.472483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.470730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-cni-binary-copy\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.472969 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.472656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72016a61-4b3e-4be4-9f3e-5106c032b993-etc-tuned\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.472969 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.472698 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72016a61-4b3e-4be4-9f3e-5106c032b993-tmp\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.494270 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.494248 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7242h\" (UniqueName: \"kubernetes.io/projected/9f8cd101-1d2b-4e8c-8a93-709cc894653b-kube-api-access-7242h\") pod \"aws-ebs-csi-driver-node-6ktxj\" (UID: \"9f8cd101-1d2b-4e8c-8a93-709cc894653b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.498900 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.498876 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvfbz\" (UniqueName: \"kubernetes.io/projected/72016a61-4b3e-4be4-9f3e-5106c032b993-kube-api-access-zvfbz\") pod \"tuned-jp5k9\" (UID: \"72016a61-4b3e-4be4-9f3e-5106c032b993\") " pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.499122 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.499106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc6t2\" (UniqueName: \"kubernetes.io/projected/1c559edc-905d-4a66-b3b6-e4767670d083-kube-api-access-xc6t2\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:32.499203 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.499150 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szgvn\" (UniqueName: \"kubernetes.io/projected/533ef48a-edc5-4453-925d-1c6e4b8c3aa0-kube-api-access-szgvn\") pod \"multus-additional-cni-plugins-xflf7\" (UID: \"533ef48a-edc5-4453-925d-1c6e4b8c3aa0\") " pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.499306 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.499287 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkmrh\" (UniqueName: \"kubernetes.io/projected/6d0dfe5e-9489-4a73-ae93-e266ba6c0e34-kube-api-access-fkmrh\") pod \"iptables-alerter-cbp5h\" (UID: \"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34\") " 
pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.500131 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.500109 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jzx\" (UniqueName: \"kubernetes.io/projected/9125009d-485e-443b-85bb-384f2afc6de2-kube-api-access-g7jzx\") pod \"node-resolver-82pct\" (UID: \"9125009d-485e-443b-85bb-384f2afc6de2\") " pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.570770 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.570727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-system-cni-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.570922 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.570787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovn-node-metrics-cert\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.570922 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.570822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-cni-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.570922 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.570852 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-os-release\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " 
pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.570922 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.570882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-cni-bin\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.570922 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.570906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-var-lib-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571198 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.570995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-cni-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571198 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571070 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-os-release\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571198 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-cni-bin\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571198 
ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571157 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-system-cni-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571198 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-run-ovn-kubernetes\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571425 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-node-log\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571425 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571259 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-node-log\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571425 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-var-lib-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571425 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:15:32.571299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-run-ovn-kubernetes\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571425 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-conf-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571425 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-etc-kubernetes\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571425 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-systemd-units\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-conf-dir\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:15:32.571433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-systemd-units\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-run-netns\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-etc-kubernetes\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-log-socket\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571534 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-run-netns\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571536 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-log-socket\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-cni-netd\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3-konnectivity-ca\") pod \"konnectivity-agent-q7ctb\" (UID: \"7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3\") " pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-socket-dir-parent\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovnkube-config\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571724 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-cni-netd\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.571748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovnkube-script-lib\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571771 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhnxz\" (UniqueName: \"kubernetes.io/projected/c5641439-7e2a-42bc-ae08-1777c6dcb692-kube-api-access-jhnxz\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-socket-dir-parent\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-cnibin\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571840 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-cni-multus\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-systemd\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571902 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-multus-certs\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.571971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd5dq\" (UniqueName: \"kubernetes.io/projected/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-kube-api-access-hd5dq\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572001 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-kubelet\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-cni-bin\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572055 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/44520956-97f4-441e-acae-e1c5b82de2ea-serviceca\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkc4m\" (UniqueName: \"kubernetes.io/projected/44520956-97f4-441e-acae-e1c5b82de2ea-kube-api-access-vkc4m\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: 
I0416 18:15:32.572174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-kubelet\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572204 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-etc-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-ovn\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-env-overrides\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.572350 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3-agent-certs\") pod \"konnectivity-agent-q7ctb\" (UID: \"7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3\") " pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:15:32.572319 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-cni-binary-copy\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-k8s-cni-cncf-io\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572413 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-netns\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-hostroot\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:15:32.572447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovnkube-config\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572470 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-daemon-config\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/44520956-97f4-441e-acae-e1c5b82de2ea-serviceca\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.572988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-multus-daemon-config\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573054 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-systemd\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573094 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-multus-certs\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573189 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573139 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-cni-multus\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-cnibin\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-var-lib-kubelet\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573333 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-netns\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-etc-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-host-run-k8s-cni-cncf-io\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44520956-97f4-441e-acae-e1c5b82de2ea-host\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573549 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44520956-97f4-441e-acae-e1c5b82de2ea-host\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-slash\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-env-overrides\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-slash\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.573738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573693 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3-konnectivity-ca\") pod \"konnectivity-agent-q7ctb\" (UID: \"7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3\") " pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:32.574288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-cni-bin\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.574288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573818 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-hostroot\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.574288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovnkube-script-lib\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.574288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-ovn\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.574288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.573990 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-kubelet\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.574288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.574007 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.574288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.574018 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5641439-7e2a-42bc-ae08-1777c6dcb692-ovn-node-metrics-cert\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.574288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.574128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5641439-7e2a-42bc-ae08-1777c6dcb692-run-openvswitch\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.574594 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.574299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-cni-binary-copy\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.576052 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.576032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3-agent-certs\") pod \"konnectivity-agent-q7ctb\" (UID: \"7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3\") " pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:32.585042 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.585018 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:15:32.585042 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.585042 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:15:32.585208 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.585056 2573 projected.go:194] Error preparing data for projected volume kube-api-access-2hd5d for pod openshift-network-diagnostics/network-check-target-2xsjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:15:32.585208 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.585151 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d podName:0e9adccb-918e-48c1-97ed-0c0d8728a3f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:33.085132803 +0000 UTC m=+3.173682200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2hd5d" (UniqueName: "kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d") pod "network-check-target-2xsjc" (UID: "0e9adccb-918e-48c1-97ed-0c0d8728a3f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:15:32.585480 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.585459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkc4m\" (UniqueName: \"kubernetes.io/projected/44520956-97f4-441e-acae-e1c5b82de2ea-kube-api-access-vkc4m\") pod \"node-ca-8694n\" (UID: \"44520956-97f4-441e-acae-e1c5b82de2ea\") " pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.586574 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.586553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhnxz\" (UniqueName: \"kubernetes.io/projected/c5641439-7e2a-42bc-ae08-1777c6dcb692-kube-api-access-jhnxz\") pod \"ovnkube-node-25kmv\" (UID: \"c5641439-7e2a-42bc-ae08-1777c6dcb692\") " pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.587286 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.587267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd5dq\" (UniqueName: \"kubernetes.io/projected/99b3e81e-b20b-4a7d-a324-a26ce7f61f58-kube-api-access-hd5dq\") pod \"multus-dp48w\" (UID: \"99b3e81e-b20b-4a7d-a324-a26ce7f61f58\") " pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.663586 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:15:32.663505 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cbp5h" Apr 16 18:15:32.671474 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.671155 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" Apr 16 18:15:32.682338 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.682313 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-82pct" Apr 16 18:15:32.687923 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.687904 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xflf7" Apr 16 18:15:32.694429 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.694415 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" Apr 16 18:15:32.701941 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.701925 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8694n" Apr 16 18:15:32.709514 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.709496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dp48w" Apr 16 18:15:32.716223 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.716206 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:32.721984 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.721966 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:15:32.975876 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:32.975814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:32.975953 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.975939 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:32.976007 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:32.975998 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:33.97598223 +0000 UTC m=+4.064531625 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:32.990719 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:32.990561 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b3e81e_b20b_4a7d_a324_a26ce7f61f58.slice/crio-cf9b3d349987b4fe6806060e902d411f344d5164b2d7cb04b65b2a774701ba56 WatchSource:0}: Error finding container cf9b3d349987b4fe6806060e902d411f344d5164b2d7cb04b65b2a774701ba56: Status 404 returned error can't find the container with id cf9b3d349987b4fe6806060e902d411f344d5164b2d7cb04b65b2a774701ba56 Apr 16 18:15:32.991680 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:32.991652 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72016a61_4b3e_4be4_9f3e_5106c032b993.slice/crio-c966cbccdbaed7ef30b17cb3922298e3cf8105073aa5f3f9b9f467a9c932d4f3 WatchSource:0}: Error finding container c966cbccdbaed7ef30b17cb3922298e3cf8105073aa5f3f9b9f467a9c932d4f3: Status 404 returned error can't find the container with id c966cbccdbaed7ef30b17cb3922298e3cf8105073aa5f3f9b9f467a9c932d4f3 Apr 16 18:15:32.996763 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:32.996723 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5641439_7e2a_42bc_ae08_1777c6dcb692.slice/crio-72623b3ca131be27f77d4327555c0ebb158490830d70382db7883e8e42ac93a2 WatchSource:0}: Error finding container 72623b3ca131be27f77d4327555c0ebb158490830d70382db7883e8e42ac93a2: Status 404 returned error can't find the container with id 72623b3ca131be27f77d4327555c0ebb158490830d70382db7883e8e42ac93a2 Apr 16 18:15:32.997625 
ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:32.997601 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d0dfe5e_9489_4a73_ae93_e266ba6c0e34.slice/crio-7244a42ca1e4b9bc59acaddd8d4b1836160f1fbe5c8ac72ab761593848d6dce8 WatchSource:0}: Error finding container 7244a42ca1e4b9bc59acaddd8d4b1836160f1fbe5c8ac72ab761593848d6dce8: Status 404 returned error can't find the container with id 7244a42ca1e4b9bc59acaddd8d4b1836160f1fbe5c8ac72ab761593848d6dce8 Apr 16 18:15:32.998399 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:32.998330 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44520956_97f4_441e_acae_e1c5b82de2ea.slice/crio-8ea594a3b2efb6348c04d2889e42720598bbd002a7001eab2a3bd66d9d27ea94 WatchSource:0}: Error finding container 8ea594a3b2efb6348c04d2889e42720598bbd002a7001eab2a3bd66d9d27ea94: Status 404 returned error can't find the container with id 8ea594a3b2efb6348c04d2889e42720598bbd002a7001eab2a3bd66d9d27ea94 Apr 16 18:15:32.999141 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:32.999118 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a47f3a5_f4cc_445c_a5c0_e2a5af6a9ea3.slice/crio-080f198cd892213e07d24bd08f43f41e5621ceba8e67d504368c00c7ec7b369d WatchSource:0}: Error finding container 080f198cd892213e07d24bd08f43f41e5621ceba8e67d504368c00c7ec7b369d: Status 404 returned error can't find the container with id 080f198cd892213e07d24bd08f43f41e5621ceba8e67d504368c00c7ec7b369d Apr 16 18:15:32.999955 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:32.999935 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9125009d_485e_443b_85bb_384f2afc6de2.slice/crio-08662f1888ebef30ec599738df5b114daaf5e7b6ac9f165e834f1741c1b83421 WatchSource:0}: 
Error finding container 08662f1888ebef30ec599738df5b114daaf5e7b6ac9f165e834f1741c1b83421: Status 404 returned error can't find the container with id 08662f1888ebef30ec599738df5b114daaf5e7b6ac9f165e834f1741c1b83421 Apr 16 18:15:33.001261 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:33.001240 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533ef48a_edc5_4453_925d_1c6e4b8c3aa0.slice/crio-b7b0eca9d57ef6e609bd304184b5a9b9f5c4b4cbd69f5e9f8ec9ebc0e54fa85a WatchSource:0}: Error finding container b7b0eca9d57ef6e609bd304184b5a9b9f5c4b4cbd69f5e9f8ec9ebc0e54fa85a: Status 404 returned error can't find the container with id b7b0eca9d57ef6e609bd304184b5a9b9f5c4b4cbd69f5e9f8ec9ebc0e54fa85a Apr 16 18:15:33.001964 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:15:33.001813 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8cd101_1d2b_4e8c_8a93_709cc894653b.slice/crio-feb0354b5ddfa46136e4ef28aea1ea19aed58f2e3f9a028db7decefcadbf5915 WatchSource:0}: Error finding container feb0354b5ddfa46136e4ef28aea1ea19aed58f2e3f9a028db7decefcadbf5915: Status 404 returned error can't find the container with id feb0354b5ddfa46136e4ef28aea1ea19aed58f2e3f9a028db7decefcadbf5915 Apr 16 18:15:33.177655 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.177628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:33.177790 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:33.177739 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:15:33.177790 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:33.177751 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:15:33.177790 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:33.177760 2573 projected.go:194] Error preparing data for projected volume kube-api-access-2hd5d for pod openshift-network-diagnostics/network-check-target-2xsjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:33.177934 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:33.177802 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d podName:0e9adccb-918e-48c1-97ed-0c0d8728a3f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:34.177789599 +0000 UTC m=+4.266338989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2hd5d" (UniqueName: "kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d") pod "network-check-target-2xsjc" (UID: "0e9adccb-918e-48c1-97ed-0c0d8728a3f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:33.445362 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.445242 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:10:31 +0000 UTC" deadline="2027-12-29 09:04:44.30749261 +0000 UTC"
Apr 16 18:15:33.445362 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.445280 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14918h49m10.862215816s"
Apr 16 18:15:33.500237 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.500196 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:33.500395 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:33.500342 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:33.513919 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.513890 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" event={"ID":"9f8cd101-1d2b-4e8c-8a93-709cc894653b","Type":"ContainerStarted","Data":"feb0354b5ddfa46136e4ef28aea1ea19aed58f2e3f9a028db7decefcadbf5915"}
Apr 16 18:15:33.532680 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.532617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-82pct" event={"ID":"9125009d-485e-443b-85bb-384f2afc6de2","Type":"ContainerStarted","Data":"08662f1888ebef30ec599738df5b114daaf5e7b6ac9f165e834f1741c1b83421"}
Apr 16 18:15:33.539987 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.539432 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q7ctb" event={"ID":"7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3","Type":"ContainerStarted","Data":"080f198cd892213e07d24bd08f43f41e5621ceba8e67d504368c00c7ec7b369d"}
Apr 16 18:15:33.546636 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.546610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8694n" event={"ID":"44520956-97f4-441e-acae-e1c5b82de2ea","Type":"ContainerStarted","Data":"8ea594a3b2efb6348c04d2889e42720598bbd002a7001eab2a3bd66d9d27ea94"}
Apr 16 18:15:33.552691 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.552663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"72623b3ca131be27f77d4327555c0ebb158490830d70382db7883e8e42ac93a2"}
Apr 16 18:15:33.560957 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.560929 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cbp5h" event={"ID":"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34","Type":"ContainerStarted","Data":"7244a42ca1e4b9bc59acaddd8d4b1836160f1fbe5c8ac72ab761593848d6dce8"}
Apr 16 18:15:33.563289 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.563265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xflf7" event={"ID":"533ef48a-edc5-4453-925d-1c6e4b8c3aa0","Type":"ContainerStarted","Data":"b7b0eca9d57ef6e609bd304184b5a9b9f5c4b4cbd69f5e9f8ec9ebc0e54fa85a"}
Apr 16 18:15:33.569362 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.569337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" event={"ID":"72016a61-4b3e-4be4-9f3e-5106c032b993","Type":"ContainerStarted","Data":"c966cbccdbaed7ef30b17cb3922298e3cf8105073aa5f3f9b9f467a9c932d4f3"}
Apr 16 18:15:33.573115 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.573091 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dp48w" event={"ID":"99b3e81e-b20b-4a7d-a324-a26ce7f61f58","Type":"ContainerStarted","Data":"cf9b3d349987b4fe6806060e902d411f344d5164b2d7cb04b65b2a774701ba56"}
Apr 16 18:15:33.588857 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.588826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" event={"ID":"50124a769f014399bcd19764e87e11fb","Type":"ContainerStarted","Data":"b0d30b6277c4c5fa9da46a9acb60370efc73900d0b1131cbe6e44adaabb7f1b0"}
Apr 16 18:15:33.988239 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:33.988205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:33.988404 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:33.988346 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:33.988480 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:33.988418 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:35.988398619 +0000 UTC m=+6.076948010 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:34.191104 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:34.189867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:34.191104 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:34.190033 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:15:34.191104 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:34.190054 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:15:34.191104 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:34.190091 2573 projected.go:194] Error preparing data for projected volume kube-api-access-2hd5d for pod openshift-network-diagnostics/network-check-target-2xsjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:34.191104 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:34.190151 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d podName:0e9adccb-918e-48c1-97ed-0c0d8728a3f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:36.190132806 +0000 UTC m=+6.278682202 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2hd5d" (UniqueName: "kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d") pod "network-check-target-2xsjc" (UID: "0e9adccb-918e-48c1-97ed-0c0d8728a3f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:34.502783 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:34.502186 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:34.502783 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:34.502310 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:15:34.617024 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:34.616738 2573 generic.go:358] "Generic (PLEG): container finished" podID="c1df915e3f3dcc8bf40de1ae187fd712" containerID="0c27ac31705f6491ead677b83efa473064446ab4380fb19017ada4e0e0648448" exitCode=0
Apr 16 18:15:34.617024 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:34.616848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" event={"ID":"c1df915e3f3dcc8bf40de1ae187fd712","Type":"ContainerDied","Data":"0c27ac31705f6491ead677b83efa473064446ab4380fb19017ada4e0e0648448"}
Apr 16 18:15:34.638949 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:34.638899 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-125.ec2.internal" podStartSLOduration=3.638881752 podStartE2EDuration="3.638881752s" podCreationTimestamp="2026-04-16 18:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:15:33.606788626 +0000 UTC m=+3.695338032" watchObservedRunningTime="2026-04-16 18:15:34.638881752 +0000 UTC m=+4.727431166"
Apr 16 18:15:35.499942 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:35.499299 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:35.499942 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:35.499484 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:35.624665 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:35.624630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" event={"ID":"c1df915e3f3dcc8bf40de1ae187fd712","Type":"ContainerStarted","Data":"9baf76de80399b00e5fee51ec3fbd65274aaebb73034cce304dccb87951b6be0"}
Apr 16 18:15:36.006540 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.006505 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:36.006705 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.006671 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:36.006840 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.006753 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:40.006730493 +0000 UTC m=+10.095279886 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:36.208298 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.208231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:36.208482 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.208384 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:15:36.208482 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.208403 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:15:36.208482 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.208416 2573 projected.go:194] Error preparing data for projected volume kube-api-access-2hd5d for pod openshift-network-diagnostics/network-check-target-2xsjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:36.208482 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.208475 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d podName:0e9adccb-918e-48c1-97ed-0c0d8728a3f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:40.208457763 +0000 UTC m=+10.297007161 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2hd5d" (UniqueName: "kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d") pod "network-check-target-2xsjc" (UID: "0e9adccb-918e-48c1-97ed-0c0d8728a3f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:36.499747 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.499660 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:36.499909 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.499783 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:15:36.568337 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.568240 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-125.ec2.internal" podStartSLOduration=5.568220274 podStartE2EDuration="5.568220274s" podCreationTimestamp="2026-04-16 18:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:15:35.655487412 +0000 UTC m=+5.744036827" watchObservedRunningTime="2026-04-16 18:15:36.568220274 +0000 UTC m=+6.656769690"
Apr 16 18:15:36.568499 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.568414 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-k4g48"]
Apr 16 18:15:36.570613 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.570584 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:36.571054 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.571027 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:36.611951 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.611778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:36.611951 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.611832 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b343ce6d-77fa-4692-ab8c-61876213aed4-dbus\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:36.611951 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.611867 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b343ce6d-77fa-4692-ab8c-61876213aed4-kubelet-config\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:36.713849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.713020 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:36.713849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.713107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b343ce6d-77fa-4692-ab8c-61876213aed4-dbus\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:36.713849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.713149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b343ce6d-77fa-4692-ab8c-61876213aed4-kubelet-config\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:36.713849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.713289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b343ce6d-77fa-4692-ab8c-61876213aed4-kubelet-config\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:36.713849 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.713396 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:36.713849 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:36.713458 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret podName:b343ce6d-77fa-4692-ab8c-61876213aed4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:37.213432823 +0000 UTC m=+7.301982217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret") pod "global-pull-secret-syncer-k4g48" (UID: "b343ce6d-77fa-4692-ab8c-61876213aed4") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:36.713849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:36.713791 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b343ce6d-77fa-4692-ab8c-61876213aed4-dbus\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:37.217768 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:37.217737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:37.217943 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:37.217912 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:37.218001 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:37.217986 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret podName:b343ce6d-77fa-4692-ab8c-61876213aed4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:38.217966879 +0000 UTC m=+8.306516283 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret") pod "global-pull-secret-syncer-k4g48" (UID: "b343ce6d-77fa-4692-ab8c-61876213aed4") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:37.499446 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:37.499312 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:37.499603 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:37.499451 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:38.225875 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:38.225675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:38.225875 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:38.225859 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:38.226400 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:38.225927 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret podName:b343ce6d-77fa-4692-ab8c-61876213aed4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:40.225908482 +0000 UTC m=+10.314457887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret") pod "global-pull-secret-syncer-k4g48" (UID: "b343ce6d-77fa-4692-ab8c-61876213aed4") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:38.500169 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:38.500093 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:38.500333 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:38.500233 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:15:38.500703 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:38.500093 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:38.500703 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:38.500671 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:39.500365 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:39.500037 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:39.500365 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:39.500201 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:40.037603 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:40.037412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:40.037771 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.037615 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:40.037771 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.037683 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:48.037665272 +0000 UTC m=+18.126214667 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:15:40.239241 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:40.238924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:40.239241 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:40.239019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:40.239241 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.239160 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:40.239241 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.239172 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:15:40.239241 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.239187 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:15:40.239241 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.239200 2573 projected.go:194] Error preparing data for projected volume kube-api-access-2hd5d for pod openshift-network-diagnostics/network-check-target-2xsjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:40.239241 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.239227 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret podName:b343ce6d-77fa-4692-ab8c-61876213aed4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:44.239207405 +0000 UTC m=+14.327756808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret") pod "global-pull-secret-syncer-k4g48" (UID: "b343ce6d-77fa-4692-ab8c-61876213aed4") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:40.239241 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.239243 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d podName:0e9adccb-918e-48c1-97ed-0c0d8728a3f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:48.239235477 +0000 UTC m=+18.327784868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2hd5d" (UniqueName: "kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d") pod "network-check-target-2xsjc" (UID: "0e9adccb-918e-48c1-97ed-0c0d8728a3f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:15:40.500287 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:40.500211 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:40.500424 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.500326 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:15:40.500424 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:40.500376 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:40.500827 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:40.500470 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:41.499764 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:41.499733 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:41.499926 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:41.499848 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:42.499550 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:42.499515 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:42.500035 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:42.499643 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:15:42.500035 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:42.499716 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:42.500035 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:42.499821 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:43.499859 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:43.499833 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:43.500264 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:43.499948 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:44.272361 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:44.272323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:44.272522 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:44.272445 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:15:44.272522 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:44.272507 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret podName:b343ce6d-77fa-4692-ab8c-61876213aed4 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:52.272490801 +0000 UTC m=+22.361040197 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret") pod "global-pull-secret-syncer-k4g48" (UID: "b343ce6d-77fa-4692-ab8c-61876213aed4") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:44.499876 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:44.499841 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:15:44.500296 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:44.499841 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:44.500296 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:44.499951 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4" Apr 16 18:15:44.500296 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:44.500048 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4" Apr 16 18:15:45.500109 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:45.500068 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:45.500554 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:45.500218 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083" Apr 16 18:15:46.499975 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:46.499940 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:46.500172 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:46.500069 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4" Apr 16 18:15:46.500172 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:46.500138 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:15:46.500558 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:46.500246 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4" Apr 16 18:15:47.499906 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:47.499871 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:47.500106 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:47.500009 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083" Apr 16 18:15:48.100323 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:48.100273 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:48.100774 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:48.100448 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:48.100774 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:48.100532 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:04.100509822 +0000 UTC m=+34.189059249 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:15:48.301961 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:48.301919 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:48.302136 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:48.302057 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:15:48.302136 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:48.302072 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:15:48.302136 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:48.302093 2573 projected.go:194] Error preparing data for projected volume kube-api-access-2hd5d for pod openshift-network-diagnostics/network-check-target-2xsjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:15:48.302270 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:48.302153 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d podName:0e9adccb-918e-48c1-97ed-0c0d8728a3f4 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:16:04.302135033 +0000 UTC m=+34.390684433 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2hd5d" (UniqueName: "kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d") pod "network-check-target-2xsjc" (UID: "0e9adccb-918e-48c1-97ed-0c0d8728a3f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:15:48.500089 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:48.499963 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:48.500355 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:48.500110 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:15:48.500355 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:48.500213 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4" Apr 16 18:15:48.500355 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:48.500113 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4" Apr 16 18:15:49.499386 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:49.499359 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:49.499748 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:49.499464 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083" Apr 16 18:15:50.499942 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.499724 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:15:50.500561 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.499786 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:50.500561 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:50.500002 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4" Apr 16 18:15:50.500561 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:50.500098 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4" Apr 16 18:15:50.651033 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.651007 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" event={"ID":"9f8cd101-1d2b-4e8c-8a93-709cc894653b","Type":"ContainerStarted","Data":"393a89ba3605b867bff764af3c1abe0f0e892fac1b9bc1ea8c982c015641a701"} Apr 16 18:15:50.652325 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.652300 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-82pct" event={"ID":"9125009d-485e-443b-85bb-384f2afc6de2","Type":"ContainerStarted","Data":"cd309e85c1621365e2493a62d286f0b7fa3c4557037d1d4c017c7036c52b907b"} Apr 16 18:15:50.653570 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.653544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q7ctb" event={"ID":"7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3","Type":"ContainerStarted","Data":"9917a2fe800299042b5aabcf84269a562c01f529e1cb032b43f47c0b986a5276"} Apr 16 18:15:50.654770 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.654740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8694n" event={"ID":"44520956-97f4-441e-acae-e1c5b82de2ea","Type":"ContainerStarted","Data":"604ce2fbfa391158a2aadde3837f110a594dbebf0123ba9c18ae435caf5d25a4"} Apr 16 18:15:50.657244 ip-10-0-135-125 kubenswrapper[2573]: 
I0416 18:15:50.657224 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log" Apr 16 18:15:50.657673 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.657582 2573 generic.go:358] "Generic (PLEG): container finished" podID="c5641439-7e2a-42bc-ae08-1777c6dcb692" containerID="72cd590ede1ba74d654e311eb2997a339adc1566aec52d6da4dae0c60aff05f7" exitCode=1 Apr 16 18:15:50.657673 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.657608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"a04850391b58498f72dd2dfd37c031cf77fd3d82bc3bdfde78d58b80def885d3"} Apr 16 18:15:50.657673 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.657633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"26b447a39b758017536d5ee73fd3ce8d29eca97b42ddf6207fae4824477ac1a2"} Apr 16 18:15:50.657849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.657649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"68ea84a5022eebfd2aea3d4bfb65d767021fb97a09e8e1946a7e920bf018e372"} Apr 16 18:15:50.657849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.657695 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerDied","Data":"72cd590ede1ba74d654e311eb2997a339adc1566aec52d6da4dae0c60aff05f7"} Apr 16 18:15:50.657849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.657709 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"b43c83adc2a21385260744862c669ebee1d81df95f729f06d8959408aa82b9f6"} Apr 16 18:15:50.658987 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.658964 2573 generic.go:358] "Generic (PLEG): container finished" podID="533ef48a-edc5-4453-925d-1c6e4b8c3aa0" containerID="80d7f98817686027f1a1e336528d23e535e091da1425c92e642e7631231d35e6" exitCode=0 Apr 16 18:15:50.659100 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.659029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xflf7" event={"ID":"533ef48a-edc5-4453-925d-1c6e4b8c3aa0","Type":"ContainerDied","Data":"80d7f98817686027f1a1e336528d23e535e091da1425c92e642e7631231d35e6"} Apr 16 18:15:50.661647 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.661624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" event={"ID":"72016a61-4b3e-4be4-9f3e-5106c032b993","Type":"ContainerStarted","Data":"d26b11c0eb69357a6c246aebc7298c53234ac53e7b1099eb64d75c70bdfbf78c"} Apr 16 18:15:50.663754 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.663729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dp48w" event={"ID":"99b3e81e-b20b-4a7d-a324-a26ce7f61f58","Type":"ContainerStarted","Data":"18a6ae8adbb99abd6dd5860480dfaf087e86ba44ce71a2dd99659fd4347c77e6"} Apr 16 18:15:50.666749 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.666700 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-82pct" podStartSLOduration=3.733661827 podStartE2EDuration="20.666689124s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:33.002364444 +0000 UTC m=+3.090913840" lastFinishedPulling="2026-04-16 18:15:49.935391734 +0000 UTC m=+20.023941137" observedRunningTime="2026-04-16 
18:15:50.666291509 +0000 UTC m=+20.754840922" watchObservedRunningTime="2026-04-16 18:15:50.666689124 +0000 UTC m=+20.755238537" Apr 16 18:15:50.698454 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.698414 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8694n" podStartSLOduration=3.781982943 podStartE2EDuration="20.69840058s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:33.000153047 +0000 UTC m=+3.088702477" lastFinishedPulling="2026-04-16 18:15:49.916570722 +0000 UTC m=+20.005120114" observedRunningTime="2026-04-16 18:15:50.698278899 +0000 UTC m=+20.786828312" watchObservedRunningTime="2026-04-16 18:15:50.69840058 +0000 UTC m=+20.786949999" Apr 16 18:15:50.718188 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.718149 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q7ctb" podStartSLOduration=11.751776199 podStartE2EDuration="20.718136839s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:33.001048721 +0000 UTC m=+3.089598137" lastFinishedPulling="2026-04-16 18:15:41.967409385 +0000 UTC m=+12.055958777" observedRunningTime="2026-04-16 18:15:50.717931618 +0000 UTC m=+20.806481032" watchObservedRunningTime="2026-04-16 18:15:50.718136839 +0000 UTC m=+20.806686252" Apr 16 18:15:50.734604 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.734560 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jp5k9" podStartSLOduration=3.767448225 podStartE2EDuration="20.734544339s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.995764161 +0000 UTC m=+3.084313552" lastFinishedPulling="2026-04-16 18:15:49.962860261 +0000 UTC m=+20.051409666" observedRunningTime="2026-04-16 18:15:50.7343493 +0000 UTC m=+20.822898714" 
watchObservedRunningTime="2026-04-16 18:15:50.734544339 +0000 UTC m=+20.823093782" Apr 16 18:15:50.750238 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:50.750195 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dp48w" podStartSLOduration=3.762940843 podStartE2EDuration="20.750179028s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.99465699 +0000 UTC m=+3.083206387" lastFinishedPulling="2026-04-16 18:15:49.981895176 +0000 UTC m=+20.070444572" observedRunningTime="2026-04-16 18:15:50.749501571 +0000 UTC m=+20.838050983" watchObservedRunningTime="2026-04-16 18:15:50.750179028 +0000 UTC m=+20.838728442" Apr 16 18:15:51.208983 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.208960 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:15:51.443744 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.443653 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:15:51.20898138Z","UUID":"e56ff53b-2cc1-4914-9152-f6de4c8392d9","Handler":null,"Name":"","Endpoint":""} Apr 16 18:15:51.445364 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.445331 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:15:51.445517 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.445371 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:15:51.499971 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.499944 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:15:51.500351 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:51.500102 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083" Apr 16 18:15:51.668589 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.668564 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log" Apr 16 18:15:51.669117 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.669066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"698ee19541c9af2743a134c868155cabb33fb2ecad6906d56303939bce34a11e"} Apr 16 18:15:51.670620 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.670594 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cbp5h" event={"ID":"6d0dfe5e-9489-4a73-ae93-e266ba6c0e34","Type":"ContainerStarted","Data":"aaae0109352c0a1d337983227e28c11d281f5621c3b0e72313ad52b9afa77a39"} Apr 16 18:15:51.673094 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.672589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" event={"ID":"9f8cd101-1d2b-4e8c-8a93-709cc894653b","Type":"ContainerStarted","Data":"6acedb2b0aa4b6c7707d24d62def2db65cf2196b95f4bcd863fd6a5908dffde7"} Apr 16 18:15:51.690160 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:51.690116 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-cbp5h" podStartSLOduration=5.091881646 podStartE2EDuration="21.690100687s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.999367563 +0000 UTC m=+3.087916954" lastFinishedPulling="2026-04-16 18:15:49.597586601 +0000 UTC m=+19.686135995" observedRunningTime="2026-04-16 18:15:51.690044372 +0000 UTC m=+21.778593786" watchObservedRunningTime="2026-04-16 18:15:51.690100687 +0000 UTC m=+21.778650100" Apr 16 18:15:52.334529 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.334332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:15:52.334675 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:52.334462 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:52.334675 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:52.334613 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret podName:b343ce6d-77fa-4692-ab8c-61876213aed4 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:08.334597569 +0000 UTC m=+38.423146963 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret") pod "global-pull-secret-syncer-k4g48" (UID: "b343ce6d-77fa-4692-ab8c-61876213aed4") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:15:52.352404 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.352380 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:52.352965 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.352947 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q7ctb" Apr 16 18:15:52.499373 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.499294 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:15:52.499373 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.499325 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:15:52.499697 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:52.499426 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4" Apr 16 18:15:52.499697 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:52.499516 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:52.677032 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.676992 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" event={"ID":"9f8cd101-1d2b-4e8c-8a93-709cc894653b","Type":"ContainerStarted","Data":"31c7fd98fcfcf73f52a6640e35cf332b7ee9bf5af595d50665630941df803798"}
Apr 16 18:15:52.677541 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.677518 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q7ctb"
Apr 16 18:15:52.678069 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.678052 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q7ctb"
Apr 16 18:15:52.706243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:52.706202 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6ktxj" podStartSLOduration=3.472805036 podStartE2EDuration="22.706189222s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:33.003303178 +0000 UTC m=+3.091852575" lastFinishedPulling="2026-04-16 18:15:52.236687357 +0000 UTC m=+22.325236761" observedRunningTime="2026-04-16 18:15:52.692569068 +0000 UTC m=+22.781118485" watchObservedRunningTime="2026-04-16 18:15:52.706189222 +0000 UTC m=+22.794738634"
Apr 16 18:15:53.499212 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:53.499183 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:53.499349 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:53.499308 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:53.681776 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:53.681748 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:15:53.682362 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:53.682100 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"dcc4f3621263ed8caa981a68f3e4ced60337b8a279074c0454c007d8b8f6ee7d"}
Apr 16 18:15:54.500174 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:54.500119 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:54.500335 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:54.500259 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:15:54.500335 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:54.500317 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:54.500456 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:54.500436 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:55.500286 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.500061 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:55.500820 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:55.500312 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:55.687960 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.687935 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:15:55.688266 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.688244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"2a3dcb3f12a3bbe37327a712230036146cef2e9e95033c7b414dbb242b450dfb"}
Apr 16 18:15:55.688583 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.688560 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv"
Apr 16 18:15:55.688721 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.688703 2573 scope.go:117] "RemoveContainer" containerID="72cd590ede1ba74d654e311eb2997a339adc1566aec52d6da4dae0c60aff05f7"
Apr 16 18:15:55.688781 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.688733 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv"
Apr 16 18:15:55.688781 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.688754 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv"
Apr 16 18:15:55.690207 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.690186 2573 generic.go:358] "Generic (PLEG): container finished" podID="533ef48a-edc5-4453-925d-1c6e4b8c3aa0" containerID="fd04a699e4cdde98381f72ba816c4e35a17d38d5f086bb7694c9c1ec18be1229" exitCode=0
Apr 16 18:15:55.690299 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.690222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xflf7" event={"ID":"533ef48a-edc5-4453-925d-1c6e4b8c3aa0","Type":"ContainerDied","Data":"fd04a699e4cdde98381f72ba816c4e35a17d38d5f086bb7694c9c1ec18be1229"}
Apr 16 18:15:55.703641 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.703618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv"
Apr 16 18:15:55.704029 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:55.704014 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv"
Apr 16 18:15:56.499484 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.499459 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:56.499582 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.499473 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:56.499582 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:56.499560 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:15:56.499694 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:56.499629 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:56.695815 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.695629 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:15:56.696210 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.696159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" event={"ID":"c5641439-7e2a-42bc-ae08-1777c6dcb692","Type":"ContainerStarted","Data":"fafb67e83af578baa87ab793250fc0ff1c17fb49b98d1d56be848f62c15d3de4"}
Apr 16 18:15:56.698192 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.698162 2573 generic.go:358] "Generic (PLEG): container finished" podID="533ef48a-edc5-4453-925d-1c6e4b8c3aa0" containerID="89f076df82f15944f39270a90447d45870da2520c8c8d148ead00342168edb4b" exitCode=0
Apr 16 18:15:56.698320 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.698218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xflf7" event={"ID":"533ef48a-edc5-4453-925d-1c6e4b8c3aa0","Type":"ContainerDied","Data":"89f076df82f15944f39270a90447d45870da2520c8c8d148ead00342168edb4b"}
Apr 16 18:15:56.725174 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.725124 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" podStartSLOduration=8.709403731 podStartE2EDuration="25.725106337s" podCreationTimestamp="2026-04-16 18:15:31 +0000 UTC" firstStartedPulling="2026-04-16 18:15:32.99845818 +0000 UTC m=+3.087007574" lastFinishedPulling="2026-04-16 18:15:50.014160777 +0000 UTC m=+20.102710180" observedRunningTime="2026-04-16 18:15:56.724768076 +0000 UTC m=+26.813317488" watchObservedRunningTime="2026-04-16 18:15:56.725106337 +0000 UTC m=+26.813655754"
Apr 16 18:15:56.983172 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.983091 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k4g48"]
Apr 16 18:15:56.983365 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.983220 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:56.983365 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:56.983326 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:56.987244 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.987219 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2xsjc"]
Apr 16 18:15:56.987361 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.987341 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:56.987456 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:56.987436 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:15:56.987867 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.987840 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7rqfn"]
Apr 16 18:15:56.987970 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:56.987957 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:56.988104 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:56.988056 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:57.701810 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:57.701777 2573 generic.go:358] "Generic (PLEG): container finished" podID="533ef48a-edc5-4453-925d-1c6e4b8c3aa0" containerID="b916da2de877cfa91d480547a6ff08b98224d39f426704fede7b8bcbf1b828af" exitCode=0
Apr 16 18:15:57.702466 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:57.701871 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xflf7" event={"ID":"533ef48a-edc5-4453-925d-1c6e4b8c3aa0","Type":"ContainerDied","Data":"b916da2de877cfa91d480547a6ff08b98224d39f426704fede7b8bcbf1b828af"}
Apr 16 18:15:58.499551 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:58.499508 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:15:58.499717 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:58.499508 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:15:58.499717 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:58.499651 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:15:58.499717 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:15:58.499508 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:15:58.499882 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:58.499773 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:15:58.499882 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:15:58.499788 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:16:00.500790 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:00.500736 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:16:00.501556 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:00.500838 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:16:00.501556 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:00.500869 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:16:00.501556 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:00.500900 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:16:00.501556 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:00.500940 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:16:00.501556 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:00.501040 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:16:02.499424 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:02.499390 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:16:02.499843 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:02.499428 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:16:02.499843 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:02.499541 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083"
Apr 16 18:16:02.499843 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:02.499610 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2xsjc" podUID="0e9adccb-918e-48c1-97ed-0c0d8728a3f4"
Apr 16 18:16:02.499843 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:02.499643 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:16:02.499843 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:02.499718 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k4g48" podUID="b343ce6d-77fa-4692-ab8c-61876213aed4"
Apr 16 18:16:03.205477 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.205448 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-125.ec2.internal" event="NodeReady"
Apr 16 18:16:03.205661 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.205592 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:16:03.255310 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.255037 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xbf7w"]
Apr 16 18:16:03.275206 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.275133 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4rlf8"]
Apr 16 18:16:03.275341 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.275313 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.277938 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.277914 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fxg25\""
Apr 16 18:16:03.278050 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.277962 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:16:03.278168 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.278135 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:16:03.286774 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.286751 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xbf7w"]
Apr 16 18:16:03.286880 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.286780 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4rlf8"]
Apr 16 18:16:03.286933 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.286884 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4rlf8"
Apr 16 18:16:03.289878 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.289854 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:16:03.289984 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.289879 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:16:03.289984 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.289899 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4scgb\""
Apr 16 18:16:03.289984 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.289899 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:16:03.417994 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.417963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52wd\" (UniqueName: \"kubernetes.io/projected/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-kube-api-access-s52wd\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.417994 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.417997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v97m\" (UniqueName: \"kubernetes.io/projected/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-kube-api-access-6v97m\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8"
Apr 16 18:16:03.418212 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.418086 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.418212 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.418115 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8"
Apr 16 18:16:03.418212 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.418155 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-tmp-dir\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.418348 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.418261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-config-volume\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.518622 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.518551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-tmp-dir\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.518622 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.518597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-config-volume\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.518622 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.518624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s52wd\" (UniqueName: \"kubernetes.io/projected/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-kube-api-access-s52wd\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.519131 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.518642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6v97m\" (UniqueName: \"kubernetes.io/projected/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-kube-api-access-6v97m\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8"
Apr 16 18:16:03.519131 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.518669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.519131 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.518684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8"
Apr 16 18:16:03.519131 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:03.518784 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:16:03.519131 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:03.518791 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:16:03.519131 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:03.518841 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert podName:61e9f3e6-a9f4-46cd-b1c0-cbdc48158979 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:04.018822169 +0000 UTC m=+34.107371576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert") pod "ingress-canary-4rlf8" (UID: "61e9f3e6-a9f4-46cd-b1c0-cbdc48158979") : secret "canary-serving-cert" not found
Apr 16 18:16:03.519131 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.518849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-tmp-dir\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.519131 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:03.518858 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls podName:d9480c7a-7ea2-4833-a0ed-10e03e4bc66d nodeName:}" failed. No retries permitted until 2026-04-16 18:16:04.018849762 +0000 UTC m=+34.107399159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls") pod "dns-default-xbf7w" (UID: "d9480c7a-7ea2-4833-a0ed-10e03e4bc66d") : secret "dns-default-metrics-tls" not found
Apr 16 18:16:03.519385 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.519281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-config-volume\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.529964 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.529946 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52wd\" (UniqueName: \"kubernetes.io/projected/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-kube-api-access-s52wd\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:03.530089 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:03.530005 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v97m\" (UniqueName: \"kubernetes.io/projected/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-kube-api-access-6v97m\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8"
Apr 16 18:16:04.021433 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.021387 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:16:04.021648 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.021440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8"
Apr 16 18:16:04.021648 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.021538 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:16:04.021648 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.021584 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:16:04.021648 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.021602 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls podName:d9480c7a-7ea2-4833-a0ed-10e03e4bc66d nodeName:}" failed. No retries permitted until 2026-04-16 18:16:05.021586956 +0000 UTC m=+35.110136347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls") pod "dns-default-xbf7w" (UID: "d9480c7a-7ea2-4833-a0ed-10e03e4bc66d") : secret "dns-default-metrics-tls" not found
Apr 16 18:16:04.021648 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.021631 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert podName:61e9f3e6-a9f4-46cd-b1c0-cbdc48158979 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:05.021619078 +0000 UTC m=+35.110168474 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert") pod "ingress-canary-4rlf8" (UID: "61e9f3e6-a9f4-46cd-b1c0-cbdc48158979") : secret "canary-serving-cert" not found
Apr 16 18:16:04.122313 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.122279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:16:04.122498 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.122421 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:04.122498 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.122482 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:36.122468671 +0000 UTC m=+66.211018062 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:16:04.324708 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.324679 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:16:04.324926 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.324803 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:16:04.324926 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.324816 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:16:04.324926 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.324825 2573 projected.go:194] Error preparing data for projected volume kube-api-access-2hd5d for pod openshift-network-diagnostics/network-check-target-2xsjc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:04.324926 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:04.324870 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d podName:0e9adccb-918e-48c1-97ed-0c0d8728a3f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:36.324858501 +0000 UTC m=+66.413407896 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2hd5d" (UniqueName: "kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d") pod "network-check-target-2xsjc" (UID: "0e9adccb-918e-48c1-97ed-0c0d8728a3f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:16:04.499517 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.499441 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:16:04.499517 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.499461 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48"
Apr 16 18:16:04.499670 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.499558 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:16:04.502436 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.502419 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:16:04.502548 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.502421 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:16:04.503603 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.503583 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4plx6\""
Apr 16 18:16:04.503603 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.503599 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:16:04.503755 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.503708 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xl8bl\""
Apr 16 18:16:04.503813 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.503753 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:16:04.717288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.717257 2573 generic.go:358] "Generic (PLEG): container finished" podID="533ef48a-edc5-4453-925d-1c6e4b8c3aa0" containerID="875cf49be9eb15bf91ebf60706bc09e9a420b43d4ff17e393ec53fcd68825c48" exitCode=0
Apr 16 18:16:04.717632 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:04.717334 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xflf7" event={"ID":"533ef48a-edc5-4453-925d-1c6e4b8c3aa0","Type":"ContainerDied","Data":"875cf49be9eb15bf91ebf60706bc09e9a420b43d4ff17e393ec53fcd68825c48"}
Apr
16 18:16:05.030608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:05.030513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w" Apr 16 18:16:05.030608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:05.030553 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 18:16:05.030807 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:05.030655 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:05.030807 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:05.030659 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:05.030807 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:05.030710 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert podName:61e9f3e6-a9f4-46cd-b1c0-cbdc48158979 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:07.030694102 +0000 UTC m=+37.119243493 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert") pod "ingress-canary-4rlf8" (UID: "61e9f3e6-a9f4-46cd-b1c0-cbdc48158979") : secret "canary-serving-cert" not found Apr 16 18:16:05.030807 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:05.030722 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls podName:d9480c7a-7ea2-4833-a0ed-10e03e4bc66d nodeName:}" failed. No retries permitted until 2026-04-16 18:16:07.030716696 +0000 UTC m=+37.119266087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls") pod "dns-default-xbf7w" (UID: "d9480c7a-7ea2-4833-a0ed-10e03e4bc66d") : secret "dns-default-metrics-tls" not found Apr 16 18:16:05.721879 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:05.721848 2573 generic.go:358] "Generic (PLEG): container finished" podID="533ef48a-edc5-4453-925d-1c6e4b8c3aa0" containerID="c610910bb3f85f9bf443a414006e5077b4975c81eddd37105db238e31621f9b5" exitCode=0 Apr 16 18:16:05.722284 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:05.721891 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xflf7" event={"ID":"533ef48a-edc5-4453-925d-1c6e4b8c3aa0","Type":"ContainerDied","Data":"c610910bb3f85f9bf443a414006e5077b4975c81eddd37105db238e31621f9b5"} Apr 16 18:16:06.726234 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:06.726200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xflf7" event={"ID":"533ef48a-edc5-4453-925d-1c6e4b8c3aa0","Type":"ContainerStarted","Data":"aa31f276675f9da58677689bad575442bed986c802f07713ea3b798a524dee67"} Apr 16 18:16:06.751251 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:06.751193 2573 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-multus/multus-additional-cni-plugins-xflf7" podStartSLOduration=6.138996386 podStartE2EDuration="36.751179131s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:33.004197827 +0000 UTC m=+3.092747218" lastFinishedPulling="2026-04-16 18:16:03.616380563 +0000 UTC m=+33.704929963" observedRunningTime="2026-04-16 18:16:06.749423437 +0000 UTC m=+36.837972839" watchObservedRunningTime="2026-04-16 18:16:06.751179131 +0000 UTC m=+36.839728545" Apr 16 18:16:07.045434 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:07.045389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w" Apr 16 18:16:07.045434 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:07.045440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 18:16:07.045667 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:07.045530 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:07.045667 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:07.045590 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls podName:d9480c7a-7ea2-4833-a0ed-10e03e4bc66d nodeName:}" failed. No retries permitted until 2026-04-16 18:16:11.045576291 +0000 UTC m=+41.134125682 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls") pod "dns-default-xbf7w" (UID: "d9480c7a-7ea2-4833-a0ed-10e03e4bc66d") : secret "dns-default-metrics-tls" not found Apr 16 18:16:07.045667 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:07.045537 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:07.045667 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:07.045671 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert podName:61e9f3e6-a9f4-46cd-b1c0-cbdc48158979 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:11.045652732 +0000 UTC m=+41.134202126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert") pod "ingress-canary-4rlf8" (UID: "61e9f3e6-a9f4-46cd-b1c0-cbdc48158979") : secret "canary-serving-cert" not found Apr 16 18:16:08.355633 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:08.355602 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:16:08.358814 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:08.358779 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b343ce6d-77fa-4692-ab8c-61876213aed4-original-pull-secret\") pod \"global-pull-secret-syncer-k4g48\" (UID: \"b343ce6d-77fa-4692-ab8c-61876213aed4\") " pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:16:08.419218 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:16:08.419189 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k4g48" Apr 16 18:16:08.562039 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:08.561989 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k4g48"] Apr 16 18:16:08.567641 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:16:08.567614 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb343ce6d_77fa_4692_ab8c_61876213aed4.slice/crio-c221623ca206f666f8ea0471978abf70a8e6f75fc15a86286ae8a8c8cb425aff WatchSource:0}: Error finding container c221623ca206f666f8ea0471978abf70a8e6f75fc15a86286ae8a8c8cb425aff: Status 404 returned error can't find the container with id c221623ca206f666f8ea0471978abf70a8e6f75fc15a86286ae8a8c8cb425aff Apr 16 18:16:08.730039 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:08.729959 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k4g48" event={"ID":"b343ce6d-77fa-4692-ab8c-61876213aed4","Type":"ContainerStarted","Data":"c221623ca206f666f8ea0471978abf70a8e6f75fc15a86286ae8a8c8cb425aff"} Apr 16 18:16:11.076905 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:11.076872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w" Apr 16 18:16:11.076905 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:11.076909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 
18:16:11.077345 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:11.077010 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:11.077345 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:11.077020 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:11.077345 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:11.077068 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert podName:61e9f3e6-a9f4-46cd-b1c0-cbdc48158979 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:19.07705482 +0000 UTC m=+49.165604211 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert") pod "ingress-canary-4rlf8" (UID: "61e9f3e6-a9f4-46cd-b1c0-cbdc48158979") : secret "canary-serving-cert" not found Apr 16 18:16:11.077345 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:11.077104 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls podName:d9480c7a-7ea2-4833-a0ed-10e03e4bc66d nodeName:}" failed. No retries permitted until 2026-04-16 18:16:19.07707319 +0000 UTC m=+49.165622586 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls") pod "dns-default-xbf7w" (UID: "d9480c7a-7ea2-4833-a0ed-10e03e4bc66d") : secret "dns-default-metrics-tls" not found Apr 16 18:16:12.739190 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:12.739104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k4g48" event={"ID":"b343ce6d-77fa-4692-ab8c-61876213aed4","Type":"ContainerStarted","Data":"83cd95c0086b54248e842b1dd3ea9f51e673f7d33be3fed36faccbd95d032d29"} Apr 16 18:16:12.754060 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:12.754025 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-k4g48" podStartSLOduration=32.914200373 podStartE2EDuration="36.754011688s" podCreationTimestamp="2026-04-16 18:15:36 +0000 UTC" firstStartedPulling="2026-04-16 18:16:08.569627673 +0000 UTC m=+38.658177064" lastFinishedPulling="2026-04-16 18:16:12.409438986 +0000 UTC m=+42.497988379" observedRunningTime="2026-04-16 18:16:12.753377217 +0000 UTC m=+42.841926630" watchObservedRunningTime="2026-04-16 18:16:12.754011688 +0000 UTC m=+42.842561079" Apr 16 18:16:19.135406 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:19.135364 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w" Apr 16 18:16:19.135406 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:19.135406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " 
pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 18:16:19.135899 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:19.135501 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:19.135899 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:19.135503 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:19.135899 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:19.135552 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert podName:61e9f3e6-a9f4-46cd-b1c0-cbdc48158979 nodeName:}" failed. No retries permitted until 2026-04-16 18:16:35.135539477 +0000 UTC m=+65.224088867 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert") pod "ingress-canary-4rlf8" (UID: "61e9f3e6-a9f4-46cd-b1c0-cbdc48158979") : secret "canary-serving-cert" not found Apr 16 18:16:19.135899 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:19.135565 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls podName:d9480c7a-7ea2-4833-a0ed-10e03e4bc66d nodeName:}" failed. No retries permitted until 2026-04-16 18:16:35.135559128 +0000 UTC m=+65.224108519 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls") pod "dns-default-xbf7w" (UID: "d9480c7a-7ea2-4833-a0ed-10e03e4bc66d") : secret "dns-default-metrics-tls" not found Apr 16 18:16:27.719475 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:27.719442 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25kmv" Apr 16 18:16:35.139761 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:35.139713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w" Apr 16 18:16:35.139761 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:35.139763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 18:16:35.140290 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:35.139871 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:16:35.140290 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:35.139938 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls podName:d9480c7a-7ea2-4833-a0ed-10e03e4bc66d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:07.13992157 +0000 UTC m=+97.228470964 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls") pod "dns-default-xbf7w" (UID: "d9480c7a-7ea2-4833-a0ed-10e03e4bc66d") : secret "dns-default-metrics-tls" not found Apr 16 18:16:35.140290 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:35.139963 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:16:35.140290 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:35.140025 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert podName:61e9f3e6-a9f4-46cd-b1c0-cbdc48158979 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:07.140006501 +0000 UTC m=+97.228555895 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert") pod "ingress-canary-4rlf8" (UID: "61e9f3e6-a9f4-46cd-b1c0-cbdc48158979") : secret "canary-serving-cert" not found Apr 16 18:16:36.146347 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.146296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:16:36.149614 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.149593 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:16:36.156920 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:36.156897 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:16:36.157013 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:16:36.156979 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:40.156958472 +0000 UTC m=+130.245507863 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : secret "metrics-daemon-secret" not found Apr 16 18:16:36.348742 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.348694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:16:36.351455 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.351437 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:16:36.362089 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.362061 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:16:36.372578 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.372550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hd5d\" (UniqueName: \"kubernetes.io/projected/0e9adccb-918e-48c1-97ed-0c0d8728a3f4-kube-api-access-2hd5d\") pod \"network-check-target-2xsjc\" (UID: \"0e9adccb-918e-48c1-97ed-0c0d8728a3f4\") " pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:16:36.612110 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.612066 2573 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xl8bl\"" Apr 16 18:16:36.620317 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.620297 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:16:36.731902 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.731871 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2xsjc"] Apr 16 18:16:36.735706 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:16:36.735672 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9adccb_918e_48c1_97ed_0c0d8728a3f4.slice/crio-05226a6ab34348bdbc55d32f4d9ba98207684205f95c1640cf8159a5f3bd415e WatchSource:0}: Error finding container 05226a6ab34348bdbc55d32f4d9ba98207684205f95c1640cf8159a5f3bd415e: Status 404 returned error can't find the container with id 05226a6ab34348bdbc55d32f4d9ba98207684205f95c1640cf8159a5f3bd415e Apr 16 18:16:36.785823 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:36.785797 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2xsjc" event={"ID":"0e9adccb-918e-48c1-97ed-0c0d8728a3f4","Type":"ContainerStarted","Data":"05226a6ab34348bdbc55d32f4d9ba98207684205f95c1640cf8159a5f3bd415e"} Apr 16 18:16:39.792563 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:39.792523 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2xsjc" event={"ID":"0e9adccb-918e-48c1-97ed-0c0d8728a3f4","Type":"ContainerStarted","Data":"186ed1fb7b5c2ea6a7039104b8f22c11989d550752af4cf48a3abcd4ba9d7e9a"} Apr 16 18:16:39.792991 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:39.792777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-2xsjc" Apr 16 18:16:39.809705 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:39.809650 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2xsjc" podStartSLOduration=66.297428408 podStartE2EDuration="1m8.809634243s" podCreationTimestamp="2026-04-16 18:15:31 +0000 UTC" firstStartedPulling="2026-04-16 18:16:36.737435927 +0000 UTC m=+66.825985318" lastFinishedPulling="2026-04-16 18:16:39.249641752 +0000 UTC m=+69.338191153" observedRunningTime="2026-04-16 18:16:39.809595672 +0000 UTC m=+69.898145084" watchObservedRunningTime="2026-04-16 18:16:39.809634243 +0000 UTC m=+69.898183635" Apr 16 18:16:45.973501 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:45.973472 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"] Apr 16 18:16:45.975433 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:45.975413 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw" Apr 16 18:16:45.978996 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:45.978974 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:16:45.978996 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:45.978996 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-zsgrd\"" Apr 16 18:16:45.979199 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:45.979013 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:16:45.979199 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:45.978975 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 18:16:45.979199 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:45.978996 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:16:45.985618 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:45.985602 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"] Apr 16 18:16:46.020560 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.020534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e8f63555-8447-4e67-8467-a4d4c7120573-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f69c4965c-g9ppw\" (UID: \"e8f63555-8447-4e67-8467-a4d4c7120573\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"
Apr 16 18:16:46.020702 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.020562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vsh8\" (UniqueName: \"kubernetes.io/projected/e8f63555-8447-4e67-8467-a4d4c7120573-kube-api-access-6vsh8\") pod \"managed-serviceaccount-addon-agent-f69c4965c-g9ppw\" (UID: \"e8f63555-8447-4e67-8467-a4d4c7120573\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"
Apr 16 18:16:46.033318 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.033294 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"]
Apr 16 18:16:46.035087 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.035063 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.037642 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.037628 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 18:16:46.045740 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.045720 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"]
Apr 16 18:16:46.121202 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.121177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfgdz\" (UniqueName: \"kubernetes.io/projected/e49e7e63-d20a-48ab-864b-a42b351bd061-kube-api-access-nfgdz\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.121320 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.121231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e49e7e63-d20a-48ab-864b-a42b351bd061-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.121320 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.121259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e8f63555-8447-4e67-8467-a4d4c7120573-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f69c4965c-g9ppw\" (UID: \"e8f63555-8447-4e67-8467-a4d4c7120573\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"
Apr 16 18:16:46.121320 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.121276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vsh8\" (UniqueName: \"kubernetes.io/projected/e8f63555-8447-4e67-8467-a4d4c7120573-kube-api-access-6vsh8\") pod \"managed-serviceaccount-addon-agent-f69c4965c-g9ppw\" (UID: \"e8f63555-8447-4e67-8467-a4d4c7120573\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"
Apr 16 18:16:46.121320 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.121299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49e7e63-d20a-48ab-864b-a42b351bd061-tmp\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.123645 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.123622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e8f63555-8447-4e67-8467-a4d4c7120573-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f69c4965c-g9ppw\" (UID: \"e8f63555-8447-4e67-8467-a4d4c7120573\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"
Apr 16 18:16:46.130232 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.130211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vsh8\" (UniqueName: \"kubernetes.io/projected/e8f63555-8447-4e67-8467-a4d4c7120573-kube-api-access-6vsh8\") pod \"managed-serviceaccount-addon-agent-f69c4965c-g9ppw\" (UID: \"e8f63555-8447-4e67-8467-a4d4c7120573\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"
Apr 16 18:16:46.222159 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.222131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e49e7e63-d20a-48ab-864b-a42b351bd061-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.222256 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.222172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49e7e63-d20a-48ab-864b-a42b351bd061-tmp\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.222256 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.222200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfgdz\" (UniqueName: \"kubernetes.io/projected/e49e7e63-d20a-48ab-864b-a42b351bd061-kube-api-access-nfgdz\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.222517 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.222499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e49e7e63-d20a-48ab-864b-a42b351bd061-tmp\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.230809 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.230756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e49e7e63-d20a-48ab-864b-a42b351bd061-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.234273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.234252 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfgdz\" (UniqueName: \"kubernetes.io/projected/e49e7e63-d20a-48ab-864b-a42b351bd061-kube-api-access-nfgdz\") pod \"klusterlet-addon-workmgr-7cd69fbbfd-s2t5f\" (UID: \"e49e7e63-d20a-48ab-864b-a42b351bd061\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.293042 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.293016 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"
Apr 16 18:16:46.343224 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.343183 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:46.409694 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.409648 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw"]
Apr 16 18:16:46.413976 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:16:46.413948 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8f63555_8447_4e67_8467_a4d4c7120573.slice/crio-58dcaf80ebf560605b715bf8819b9ec2f32218c85774f8e79abeece9440c078e WatchSource:0}: Error finding container 58dcaf80ebf560605b715bf8819b9ec2f32218c85774f8e79abeece9440c078e: Status 404 returned error can't find the container with id 58dcaf80ebf560605b715bf8819b9ec2f32218c85774f8e79abeece9440c078e
Apr 16 18:16:46.462382 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.462357 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"]
Apr 16 18:16:46.465372 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:16:46.465345 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode49e7e63_d20a_48ab_864b_a42b351bd061.slice/crio-e397b5a6264cd8767f0bff4a0836e254669ea272fd4db45c046f6890f173c941 WatchSource:0}: Error finding container e397b5a6264cd8767f0bff4a0836e254669ea272fd4db45c046f6890f173c941: Status 404 returned error can't find the container with id e397b5a6264cd8767f0bff4a0836e254669ea272fd4db45c046f6890f173c941
Apr 16 18:16:46.805683 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.805647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f" event={"ID":"e49e7e63-d20a-48ab-864b-a42b351bd061","Type":"ContainerStarted","Data":"e397b5a6264cd8767f0bff4a0836e254669ea272fd4db45c046f6890f173c941"}
Apr 16 18:16:46.806738 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:46.806713 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw" event={"ID":"e8f63555-8447-4e67-8467-a4d4c7120573","Type":"ContainerStarted","Data":"58dcaf80ebf560605b715bf8819b9ec2f32218c85774f8e79abeece9440c078e"}
Apr 16 18:16:49.814712 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:49.814677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw" event={"ID":"e8f63555-8447-4e67-8467-a4d4c7120573","Type":"ContainerStarted","Data":"5fa9b7c21b6f0ddde6a2e93e950a83a0d6d69f39cf8488e2735a939147ff1c8c"}
Apr 16 18:16:49.832093 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:49.831988 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f69c4965c-g9ppw" podStartSLOduration=2.188281939 podStartE2EDuration="4.831974178s" podCreationTimestamp="2026-04-16 18:16:45 +0000 UTC" firstStartedPulling="2026-04-16 18:16:46.416032541 +0000 UTC m=+76.504581933" lastFinishedPulling="2026-04-16 18:16:49.05972478 +0000 UTC m=+79.148274172" observedRunningTime="2026-04-16 18:16:49.831779547 +0000 UTC m=+79.920328959" watchObservedRunningTime="2026-04-16 18:16:49.831974178 +0000 UTC m=+79.920523591"
Apr 16 18:16:50.817432 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:50.817395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f" event={"ID":"e49e7e63-d20a-48ab-864b-a42b351bd061","Type":"ContainerStarted","Data":"79941e9e0db7f1faa7974877252c6decf8e0937c5e0f880d090cbdf9665f8d61"}
Apr 16 18:16:50.839940 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:50.839902 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f" podStartSLOduration=0.941284 podStartE2EDuration="4.839888904s" podCreationTimestamp="2026-04-16 18:16:46 +0000 UTC" firstStartedPulling="2026-04-16 18:16:46.467006021 +0000 UTC m=+76.555555416" lastFinishedPulling="2026-04-16 18:16:50.365610926 +0000 UTC m=+80.454160320" observedRunningTime="2026-04-16 18:16:50.838879092 +0000 UTC m=+80.927428506" watchObservedRunningTime="2026-04-16 18:16:50.839888904 +0000 UTC m=+80.928438340"
Apr 16 18:16:51.819892 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:51.819855 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:16:51.821511 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:16:51.821492 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cd69fbbfd-s2t5f"
Apr 16 18:17:07.166692 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:07.166644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w"
Apr 16 18:17:07.166692 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:07.166698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8"
Apr 16 18:17:07.167192 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:07.166779 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:07.167192 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:07.166839 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert podName:61e9f3e6-a9f4-46cd-b1c0-cbdc48158979 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:11.166824989 +0000 UTC m=+161.255374379 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert") pod "ingress-canary-4rlf8" (UID: "61e9f3e6-a9f4-46cd-b1c0-cbdc48158979") : secret "canary-serving-cert" not found
Apr 16 18:17:07.167192 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:07.166777 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:07.167192 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:07.166922 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls podName:d9480c7a-7ea2-4833-a0ed-10e03e4bc66d nodeName:}" failed. No retries permitted until 2026-04-16 18:18:11.166908289 +0000 UTC m=+161.255457685 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls") pod "dns-default-xbf7w" (UID: "d9480c7a-7ea2-4833-a0ed-10e03e4bc66d") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:10.797169 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:10.797140 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2xsjc"
Apr 16 18:17:40.188311 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:40.188275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:17:40.188787 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:40.188422 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:17:40.188787 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:40.188516 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs podName:1c559edc-905d-4a66-b3b6-e4767670d083 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:42.18849939 +0000 UTC m=+252.277048781 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs") pod "network-metrics-daemon-7rqfn" (UID: "1c559edc-905d-4a66-b3b6-e4767670d083") : secret "metrics-daemon-secret" not found
Apr 16 18:17:49.050919 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.050885 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"]
Apr 16 18:17:49.052733 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.052719 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:49.055308 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.055285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 18:17:49.055435 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.055345 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:49.055435 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.055407 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-q788n\""
Apr 16 18:17:49.056765 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.056744 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:49.065875 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.065854 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"]
Apr 16 18:17:49.147135 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.147073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:49.147135 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.147143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wr5l\" (UniqueName: \"kubernetes.io/projected/93cf13c9-c280-49b0-9865-5917d5bf1263-kube-api-access-4wr5l\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:49.248351 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.248314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:49.248351 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.248356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wr5l\" (UniqueName: \"kubernetes.io/projected/93cf13c9-c280-49b0-9865-5917d5bf1263-kube-api-access-4wr5l\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:49.248568 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:49.248464 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:17:49.248568 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:49.248527 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls podName:93cf13c9-c280-49b0-9865-5917d5bf1263 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:49.748512336 +0000 UTC m=+139.837061727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls") pod "cluster-samples-operator-667775844f-84kxm" (UID: "93cf13c9-c280-49b0-9865-5917d5bf1263") : secret "samples-operator-tls" not found
Apr 16 18:17:49.256916 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.256891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wr5l\" (UniqueName: \"kubernetes.io/projected/93cf13c9-c280-49b0-9865-5917d5bf1263-kube-api-access-4wr5l\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:49.752164 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:49.752125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:49.752323 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:49.752268 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:17:49.752370 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:49.752346 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls podName:93cf13c9-c280-49b0-9865-5917d5bf1263 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:50.752324727 +0000 UTC m=+140.840874140 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls") pod "cluster-samples-operator-667775844f-84kxm" (UID: "93cf13c9-c280-49b0-9865-5917d5bf1263") : secret "samples-operator-tls" not found
Apr 16 18:17:50.759937 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:50.759902 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:50.760307 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:50.760016 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:17:50.760307 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:50.760100 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls podName:93cf13c9-c280-49b0-9865-5917d5bf1263 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:52.760064115 +0000 UTC m=+142.848613515 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls") pod "cluster-samples-operator-667775844f-84kxm" (UID: "93cf13c9-c280-49b0-9865-5917d5bf1263") : secret "samples-operator-tls" not found
Apr 16 18:17:51.977326 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:51.977300 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-82pct_9125009d-485e-443b-85bb-384f2afc6de2/dns-node-resolver/0.log"
Apr 16 18:17:52.773801 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:52.773760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:52.773966 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:52.773917 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:17:52.774006 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:52.773980 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls podName:93cf13c9-c280-49b0-9865-5917d5bf1263 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:56.77396448 +0000 UTC m=+146.862513884 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls") pod "cluster-samples-operator-667775844f-84kxm" (UID: "93cf13c9-c280-49b0-9865-5917d5bf1263") : secret "samples-operator-tls" not found
Apr 16 18:17:52.776473 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:52.776455 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8694n_44520956-97f4-441e-acae-e1c5b82de2ea/node-ca/0.log"
Apr 16 18:17:54.051243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.051202 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"]
Apr 16 18:17:54.054716 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.054687 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.057329 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.057303 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 18:17:54.057329 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.057320 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 18:17:54.057512 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.057327 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:54.057684 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.057670 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:54.058647 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.058630 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-5rfzv\""
Apr 16 18:17:54.062548 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.062528 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"]
Apr 16 18:17:54.183825 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.183803 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np775\" (UniqueName: \"kubernetes.io/projected/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-kube-api-access-np775\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.183950 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.183834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.183950 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.183893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.284555 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.284523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-np775\" (UniqueName: \"kubernetes.io/projected/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-kube-api-access-np775\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.284687 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.284566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.284687 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.284615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.285110 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.285070 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.286805 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.286785 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.293779 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.293761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-np775\" (UniqueName: \"kubernetes.io/projected/fd04a1c3-675f-42ec-b892-a37ac5e7f02c-kube-api-access-np775\") pod \"kube-storage-version-migrator-operator-756bb7d76f-crpf7\" (UID: \"fd04a1c3-675f-42ec-b892-a37ac5e7f02c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.364123 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.364105 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"
Apr 16 18:17:54.476595 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.476565 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7"]
Apr 16 18:17:54.479296 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:17:54.479271 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd04a1c3_675f_42ec_b892_a37ac5e7f02c.slice/crio-30bc3e6357046f31a3f260873b9ac8de0f766d0f16d2d992bafebf1006c93e6a WatchSource:0}: Error finding container 30bc3e6357046f31a3f260873b9ac8de0f766d0f16d2d992bafebf1006c93e6a: Status 404 returned error can't find the container with id 30bc3e6357046f31a3f260873b9ac8de0f766d0f16d2d992bafebf1006c93e6a
Apr 16 18:17:54.938806 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:54.938774 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7" event={"ID":"fd04a1c3-675f-42ec-b892-a37ac5e7f02c","Type":"ContainerStarted","Data":"30bc3e6357046f31a3f260873b9ac8de0f766d0f16d2d992bafebf1006c93e6a"}
Apr 16 18:17:56.803422 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:56.803385 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"
Apr 16 18:17:56.803798 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:56.803544 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:17:56.803798 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:17:56.803612 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls podName:93cf13c9-c280-49b0-9865-5917d5bf1263 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:04.803595329 +0000 UTC m=+154.892144721 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls") pod "cluster-samples-operator-667775844f-84kxm" (UID: "93cf13c9-c280-49b0-9865-5917d5bf1263") : secret "samples-operator-tls" not found
Apr 16 18:17:56.943983 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:56.943901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7" event={"ID":"fd04a1c3-675f-42ec-b892-a37ac5e7f02c","Type":"ContainerStarted","Data":"d5bbbe909d90b97cf9c167076a689d040c70008496c7dd799a5078a3d23b09ee"}
Apr 16 18:17:56.963589 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:56.963546 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7" podStartSLOduration=0.809918039 podStartE2EDuration="2.963532367s" podCreationTimestamp="2026-04-16 18:17:54 +0000 UTC" firstStartedPulling="2026-04-16 18:17:54.480974907 +0000 UTC m=+144.569524298" lastFinishedPulling="2026-04-16 18:17:56.634589231 +0000 UTC m=+146.723138626" observedRunningTime="2026-04-16 18:17:56.961872083 +0000 UTC m=+147.050421497" watchObservedRunningTime="2026-04-16 18:17:56.963532367 +0000 UTC m=+147.052081780"
Apr 16 18:17:57.415462 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.415431 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp"]
Apr 16 18:17:57.417397 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.417381 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp"
Apr 16 18:17:57.420031 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.420011 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-wxgjd\""
Apr 16 18:17:57.430253 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.430227 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp"]
Apr 16 18:17:57.610025 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.609985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxm9z\" (UniqueName: \"kubernetes.io/projected/ae307bf6-3540-4f0e-a6ff-9d14408fe2bb-kube-api-access-cxm9z\") pod \"network-check-source-7b678d77c7-jtdhp\" (UID: \"ae307bf6-3540-4f0e-a6ff-9d14408fe2bb\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp"
Apr 16 18:17:57.710591 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.710502 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxm9z\" (UniqueName: \"kubernetes.io/projected/ae307bf6-3540-4f0e-a6ff-9d14408fe2bb-kube-api-access-cxm9z\") pod \"network-check-source-7b678d77c7-jtdhp\" (UID: \"ae307bf6-3540-4f0e-a6ff-9d14408fe2bb\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp"
Apr 16 18:17:57.721703 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.721669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxm9z\" (UniqueName: \"kubernetes.io/projected/ae307bf6-3540-4f0e-a6ff-9d14408fe2bb-kube-api-access-cxm9z\") pod \"network-check-source-7b678d77c7-jtdhp\" (UID: \"ae307bf6-3540-4f0e-a6ff-9d14408fe2bb\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp"
Apr 16 18:17:57.725424 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.725386 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp"
Apr 16 18:17:57.845476 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.845271 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp"]
Apr 16 18:17:57.850256 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:17:57.850215 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae307bf6_3540_4f0e_a6ff_9d14408fe2bb.slice/crio-594ebd58d57f53420a96869abfcc8aacb578877b31973ab64c6c96b9363e27ca WatchSource:0}: Error finding container 594ebd58d57f53420a96869abfcc8aacb578877b31973ab64c6c96b9363e27ca: Status 404 returned error can't find the container with id 594ebd58d57f53420a96869abfcc8aacb578877b31973ab64c6c96b9363e27ca
Apr 16 18:17:57.947863 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.947825 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp" event={"ID":"ae307bf6-3540-4f0e-a6ff-9d14408fe2bb","Type":"ContainerStarted","Data":"cb2245667279a6a7656638a892659803c7bdba066ecb125390d5605876b0cd1b"}
Apr 16 18:17:57.947863 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.947864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp" event={"ID":"ae307bf6-3540-4f0e-a6ff-9d14408fe2bb","Type":"ContainerStarted","Data":"594ebd58d57f53420a96869abfcc8aacb578877b31973ab64c6c96b9363e27ca"}
Apr 16 18:17:57.999471 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:57.999353 2573
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jtdhp" podStartSLOduration=0.999331393 podStartE2EDuration="999.331393ms" podCreationTimestamp="2026-04-16 18:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:57.997889589 +0000 UTC m=+148.086439002" watchObservedRunningTime="2026-04-16 18:17:57.999331393 +0000 UTC m=+148.087880807" Apr 16 18:17:58.019474 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.019444 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx"] Apr 16 18:17:58.022200 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.022183 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" Apr 16 18:17:58.025109 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.025068 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 18:17:58.025206 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.025065 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 18:17:58.025206 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.025131 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7r668\"" Apr 16 18:17:58.040520 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.040497 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx"] Apr 16 18:17:58.214982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.214940 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cq5h\" (UniqueName: \"kubernetes.io/projected/4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad-kube-api-access-7cq5h\") pod \"migrator-64d4d94569-zp6tx\" (UID: \"4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" Apr 16 18:17:58.315395 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.315365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cq5h\" (UniqueName: \"kubernetes.io/projected/4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad-kube-api-access-7cq5h\") pod \"migrator-64d4d94569-zp6tx\" (UID: \"4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" Apr 16 18:17:58.323806 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.323786 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cq5h\" (UniqueName: \"kubernetes.io/projected/4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad-kube-api-access-7cq5h\") pod \"migrator-64d4d94569-zp6tx\" (UID: \"4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" Apr 16 18:17:58.331652 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.331631 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" Apr 16 18:17:58.449889 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.449858 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx"] Apr 16 18:17:58.454201 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:17:58.454176 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d0a9f4e_bdd1_416b_8dc9_28df41dd54ad.slice/crio-1422a0e25680eeb5c4de6728045c9b2363b3504321c73f34407e2b2d89e58e06 WatchSource:0}: Error finding container 1422a0e25680eeb5c4de6728045c9b2363b3504321c73f34407e2b2d89e58e06: Status 404 returned error can't find the container with id 1422a0e25680eeb5c4de6728045c9b2363b3504321c73f34407e2b2d89e58e06 Apr 16 18:17:58.952756 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:58.952716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" event={"ID":"4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad","Type":"ContainerStarted","Data":"1422a0e25680eeb5c4de6728045c9b2363b3504321c73f34407e2b2d89e58e06"} Apr 16 18:17:59.956908 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:59.956877 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" event={"ID":"4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad","Type":"ContainerStarted","Data":"d8cbaed37e9772e9d34ae7699869f22e6d559ee53e06f4dee4d7d55b7abaf44b"} Apr 16 18:17:59.956908 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:17:59.956910 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" event={"ID":"4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad","Type":"ContainerStarted","Data":"a81eb13b5bc6b7f9c71a9308c299523b73ab4140def5bddf5cd40bdfa0c8da02"} Apr 16 18:17:59.974656 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:17:59.974614 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-zp6tx" podStartSLOduration=2.001823608 podStartE2EDuration="2.974601629s" podCreationTimestamp="2026-04-16 18:17:57 +0000 UTC" firstStartedPulling="2026-04-16 18:17:58.456038625 +0000 UTC m=+148.544588015" lastFinishedPulling="2026-04-16 18:17:59.428816645 +0000 UTC m=+149.517366036" observedRunningTime="2026-04-16 18:17:59.973885079 +0000 UTC m=+150.062434506" watchObservedRunningTime="2026-04-16 18:17:59.974601629 +0000 UTC m=+150.063151060" Apr 16 18:18:04.865451 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:04.865416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm" Apr 16 18:18:04.865847 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:18:04.865553 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:18:04.865847 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:18:04.865619 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls podName:93cf13c9-c280-49b0-9865-5917d5bf1263 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:20.865604739 +0000 UTC m=+170.954154130 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls") pod "cluster-samples-operator-667775844f-84kxm" (UID: "93cf13c9-c280-49b0-9865-5917d5bf1263") : secret "samples-operator-tls" not found Apr 16 18:18:06.287625 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:18:06.287583 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xbf7w" podUID="d9480c7a-7ea2-4833-a0ed-10e03e4bc66d" Apr 16 18:18:06.297547 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:18:06.297518 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4rlf8" podUID="61e9f3e6-a9f4-46cd-b1c0-cbdc48158979" Apr 16 18:18:06.975690 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:06.975661 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xbf7w" Apr 16 18:18:07.514302 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:18:07.514269 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7rqfn" podUID="1c559edc-905d-4a66-b3b6-e4767670d083" Apr 16 18:18:11.211800 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:11.211769 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w" Apr 16 18:18:11.211800 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:11.211804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 18:18:11.214047 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:11.214016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9480c7a-7ea2-4833-a0ed-10e03e4bc66d-metrics-tls\") pod \"dns-default-xbf7w\" (UID: \"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d\") " pod="openshift-dns/dns-default-xbf7w" Apr 16 18:18:11.214180 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:11.214163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61e9f3e6-a9f4-46cd-b1c0-cbdc48158979-cert\") pod \"ingress-canary-4rlf8\" (UID: \"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979\") " pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 18:18:11.479034 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:18:11.478959 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fxg25\"" Apr 16 18:18:11.487210 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:11.487196 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xbf7w" Apr 16 18:18:11.605107 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:11.605065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xbf7w"] Apr 16 18:18:11.607408 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:11.607383 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9480c7a_7ea2_4833_a0ed_10e03e4bc66d.slice/crio-1d1c1d82e52763c0cdde2b892a2ef448acc4e327d32b793ae4e04911c6144ff8 WatchSource:0}: Error finding container 1d1c1d82e52763c0cdde2b892a2ef448acc4e327d32b793ae4e04911c6144ff8: Status 404 returned error can't find the container with id 1d1c1d82e52763c0cdde2b892a2ef448acc4e327d32b793ae4e04911c6144ff8 Apr 16 18:18:11.988156 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:11.988121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xbf7w" event={"ID":"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d","Type":"ContainerStarted","Data":"1d1c1d82e52763c0cdde2b892a2ef448acc4e327d32b793ae4e04911c6144ff8"} Apr 16 18:18:12.993691 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:12.993657 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xbf7w" event={"ID":"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d","Type":"ContainerStarted","Data":"ece9ab55cd0cbc9bb958a0c107e726eb969324f6025d16cc207987ba4f7c50d2"} Apr 16 18:18:13.997786 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:13.997749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xbf7w" 
event={"ID":"d9480c7a-7ea2-4833-a0ed-10e03e4bc66d","Type":"ContainerStarted","Data":"23711b55f5aa32c3da23ca643a16cb48b501b2d1e94dad5305fdd98729a68432"} Apr 16 18:18:13.998195 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:13.997916 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xbf7w" Apr 16 18:18:14.021053 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:14.021001 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xbf7w" podStartSLOduration=129.781484903 podStartE2EDuration="2m11.020987615s" podCreationTimestamp="2026-04-16 18:16:03 +0000 UTC" firstStartedPulling="2026-04-16 18:18:11.609216105 +0000 UTC m=+161.697765496" lastFinishedPulling="2026-04-16 18:18:12.848718804 +0000 UTC m=+162.937268208" observedRunningTime="2026-04-16 18:18:14.019337127 +0000 UTC m=+164.107886550" watchObservedRunningTime="2026-04-16 18:18:14.020987615 +0000 UTC m=+164.109537027" Apr 16 18:18:18.500277 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:18.500193 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 18:18:18.500277 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:18.500198 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn" Apr 16 18:18:18.503285 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:18.503265 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4scgb\"" Apr 16 18:18:18.511506 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:18.511483 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4rlf8" Apr 16 18:18:18.620627 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:18.620602 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4rlf8"] Apr 16 18:18:18.623328 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:18.623299 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e9f3e6_a9f4_46cd_b1c0_cbdc48158979.slice/crio-772a4a34b3844005e5c75663954d7e3062187606a7b02d165db33b4b6cf1cac7 WatchSource:0}: Error finding container 772a4a34b3844005e5c75663954d7e3062187606a7b02d165db33b4b6cf1cac7: Status 404 returned error can't find the container with id 772a4a34b3844005e5c75663954d7e3062187606a7b02d165db33b4b6cf1cac7 Apr 16 18:18:19.012547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:19.012508 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4rlf8" event={"ID":"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979","Type":"ContainerStarted","Data":"772a4a34b3844005e5c75663954d7e3062187606a7b02d165db33b4b6cf1cac7"} Apr 16 18:18:20.885373 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:20.885342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: \"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm" Apr 16 18:18:20.887635 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:20.887616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/93cf13c9-c280-49b0-9865-5917d5bf1263-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-84kxm\" (UID: 
\"93cf13c9-c280-49b0-9865-5917d5bf1263\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm" Apr 16 18:18:21.021698 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.021661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4rlf8" event={"ID":"61e9f3e6-a9f4-46cd-b1c0-cbdc48158979","Type":"ContainerStarted","Data":"013800eb2687a8bf4f30f09ca96c64aa4217ed27382e53d7449cdc0e3e324bd5"} Apr 16 18:18:21.037037 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.036993 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4rlf8" podStartSLOduration=136.526404235 podStartE2EDuration="2m18.036979851s" podCreationTimestamp="2026-04-16 18:16:03 +0000 UTC" firstStartedPulling="2026-04-16 18:18:18.625139311 +0000 UTC m=+168.713688703" lastFinishedPulling="2026-04-16 18:18:20.135714914 +0000 UTC m=+170.224264319" observedRunningTime="2026-04-16 18:18:21.036465504 +0000 UTC m=+171.125014916" watchObservedRunningTime="2026-04-16 18:18:21.036979851 +0000 UTC m=+171.125529285" Apr 16 18:18:21.164234 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.164167 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm" Apr 16 18:18:21.275246 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.275209 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm"] Apr 16 18:18:21.535402 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.535330 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mkfg8"] Apr 16 18:18:21.538317 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.538301 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.541180 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.541159 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:18:21.541852 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.541825 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:18:21.541956 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.541912 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:18:21.541998 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.541963 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:18:21.542050 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.542036 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-czcrj\"" Apr 16 18:18:21.554778 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.554759 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mkfg8"] Apr 16 18:18:21.590066 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.590041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.590172 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.590106 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-data-volume\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.590223 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.590186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-crio-socket\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.590265 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.590229 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.590304 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.590262 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgc5d\" (UniqueName: \"kubernetes.io/projected/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-kube-api-access-sgc5d\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.625696 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.625669 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv"] Apr 16 18:18:21.628491 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.628477 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" Apr 16 18:18:21.631673 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.631658 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 18:18:21.632110 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.632072 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-qq7g4\"" Apr 16 18:18:21.638417 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.638398 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cfdd9f895-79lv8"] Apr 16 18:18:21.641178 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.641161 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv"] Apr 16 18:18:21.641284 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.641266 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.643955 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.643930 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-846cr\"" Apr 16 18:18:21.643955 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.643946 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:18:21.644141 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.643953 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:18:21.644141 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.644016 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:18:21.648718 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.648701 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:18:21.662818 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.662798 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cfdd9f895-79lv8"] Apr 16 18:18:21.690559 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690532 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/26a76190-c321-4d84-ab5e-db9513250bed-image-registry-private-configuration\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.690713 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690565 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-registry-tls\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.690713 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-bound-sa-token\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.690713 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgc5d\" (UniqueName: \"kubernetes.io/projected/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-kube-api-access-sgc5d\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.690713 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4vsz\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-kube-api-access-r4vsz\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.690883 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5cd34ba5-a09f-4513-a54c-72a781092903-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-9cb97cd87-bbjsv\" (UID: \"5cd34ba5-a09f-4513-a54c-72a781092903\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" Apr 16 18:18:21.690883 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-data-volume\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.690883 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-crio-socket\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.690883 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.690883 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26a76190-c321-4d84-ab5e-db9513250bed-trusted-ca\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.690883 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:18:21.690878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/26a76190-c321-4d84-ab5e-db9513250bed-installation-pull-secrets\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.691147 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.691147 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/26a76190-c321-4d84-ab5e-db9513250bed-registry-certificates\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.691147 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/26a76190-c321-4d84-ab5e-db9513250bed-ca-trust-extracted\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.691147 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.690969 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-crio-socket\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.691147 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.691123 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-data-volume\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.691417 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.691398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.693156 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.693135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.705022 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.705003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgc5d\" (UniqueName: \"kubernetes.io/projected/a4e48994-8aa5-499b-9d41-ca0fd8b95a04-kube-api-access-sgc5d\") pod \"insights-runtime-extractor-mkfg8\" (UID: \"a4e48994-8aa5-499b-9d41-ca0fd8b95a04\") " pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.792131 ip-10-0-135-125 kubenswrapper[2573]: 
I0416 18:18:21.792027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-registry-tls\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.792131 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.792087 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-bound-sa-token\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.792131 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.792116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4vsz\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-kube-api-access-r4vsz\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.792444 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.792144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5cd34ba5-a09f-4513-a54c-72a781092903-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-bbjsv\" (UID: \"5cd34ba5-a09f-4513-a54c-72a781092903\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" Apr 16 18:18:21.792444 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.792205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26a76190-c321-4d84-ab5e-db9513250bed-trusted-ca\") pod 
\"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.792444 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.792237 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/26a76190-c321-4d84-ab5e-db9513250bed-installation-pull-secrets\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.792444 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.792269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/26a76190-c321-4d84-ab5e-db9513250bed-registry-certificates\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.792444 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.792301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/26a76190-c321-4d84-ab5e-db9513250bed-ca-trust-extracted\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.792444 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.792338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/26a76190-c321-4d84-ab5e-db9513250bed-image-registry-private-configuration\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.792869 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:18:21.792839 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/26a76190-c321-4d84-ab5e-db9513250bed-ca-trust-extracted\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.793192 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.793168 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/26a76190-c321-4d84-ab5e-db9513250bed-registry-certificates\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.793322 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.793292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/26a76190-c321-4d84-ab5e-db9513250bed-trusted-ca\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.795161 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.795091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-registry-tls\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.795161 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.795097 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/26a76190-c321-4d84-ab5e-db9513250bed-image-registry-private-configuration\") pod 
\"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.795290 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.795222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5cd34ba5-a09f-4513-a54c-72a781092903-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-bbjsv\" (UID: \"5cd34ba5-a09f-4513-a54c-72a781092903\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" Apr 16 18:18:21.795328 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.795282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/26a76190-c321-4d84-ab5e-db9513250bed-installation-pull-secrets\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.800567 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.800538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-bound-sa-token\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.800889 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.800872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4vsz\" (UniqueName: \"kubernetes.io/projected/26a76190-c321-4d84-ab5e-db9513250bed-kube-api-access-r4vsz\") pod \"image-registry-5cfdd9f895-79lv8\" (UID: \"26a76190-c321-4d84-ab5e-db9513250bed\") " pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.847045 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.847019 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mkfg8" Apr 16 18:18:21.936707 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.936680 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" Apr 16 18:18:21.950923 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.950537 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:21.987934 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:21.987782 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mkfg8"] Apr 16 18:18:21.993664 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:21.993035 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4e48994_8aa5_499b_9d41_ca0fd8b95a04.slice/crio-4d80f5058fb198882fc5ffce5fbe8076f4073a69b697d98b5b7c6906d51092e8 WatchSource:0}: Error finding container 4d80f5058fb198882fc5ffce5fbe8076f4073a69b697d98b5b7c6906d51092e8: Status 404 returned error can't find the container with id 4d80f5058fb198882fc5ffce5fbe8076f4073a69b697d98b5b7c6906d51092e8 Apr 16 18:18:22.027006 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:22.026930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mkfg8" event={"ID":"a4e48994-8aa5-499b-9d41-ca0fd8b95a04","Type":"ContainerStarted","Data":"4d80f5058fb198882fc5ffce5fbe8076f4073a69b697d98b5b7c6906d51092e8"} Apr 16 18:18:22.029049 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:22.029012 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm" 
event={"ID":"93cf13c9-c280-49b0-9865-5917d5bf1263","Type":"ContainerStarted","Data":"50b9f941f7255b0fe95c61b0ba6c5299cd925a52858ace0a791c46b87c7372e4"} Apr 16 18:18:22.087688 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:22.087539 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv"] Apr 16 18:18:22.091159 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:22.091132 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cd34ba5_a09f_4513_a54c_72a781092903.slice/crio-d69f47787a1b42b43af19df0c6b322d7a62286259fcd51209100a398459e895c WatchSource:0}: Error finding container d69f47787a1b42b43af19df0c6b322d7a62286259fcd51209100a398459e895c: Status 404 returned error can't find the container with id d69f47787a1b42b43af19df0c6b322d7a62286259fcd51209100a398459e895c Apr 16 18:18:22.111652 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:22.111609 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cfdd9f895-79lv8"] Apr 16 18:18:22.116615 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:22.116589 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26a76190_c321_4d84_ab5e_db9513250bed.slice/crio-c7af4f1bb28cc7e26911d523f48ec24ce8f09baa0c7e415259966c08f7540104 WatchSource:0}: Error finding container c7af4f1bb28cc7e26911d523f48ec24ce8f09baa0c7e415259966c08f7540104: Status 404 returned error can't find the container with id c7af4f1bb28cc7e26911d523f48ec24ce8f09baa0c7e415259966c08f7540104 Apr 16 18:18:23.032341 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:23.032307 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mkfg8" 
event={"ID":"a4e48994-8aa5-499b-9d41-ca0fd8b95a04","Type":"ContainerStarted","Data":"a2facad03cba4e5c949101b98dc0d14cc492835037fbb0fb44bea595ca244cf6"} Apr 16 18:18:23.033341 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:23.033317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" event={"ID":"5cd34ba5-a09f-4513-a54c-72a781092903","Type":"ContainerStarted","Data":"d69f47787a1b42b43af19df0c6b322d7a62286259fcd51209100a398459e895c"} Apr 16 18:18:23.034366 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:23.034348 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" event={"ID":"26a76190-c321-4d84-ab5e-db9513250bed","Type":"ContainerStarted","Data":"1ec0b02f50181a6e18a11fcfc1f8a874d3e9175f545f6fbf57d241cfd646c2a2"} Apr 16 18:18:23.034449 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:23.034369 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" event={"ID":"26a76190-c321-4d84-ab5e-db9513250bed","Type":"ContainerStarted","Data":"c7af4f1bb28cc7e26911d523f48ec24ce8f09baa0c7e415259966c08f7540104"} Apr 16 18:18:23.034493 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:23.034483 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:23.055827 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:23.055786 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" podStartSLOduration=2.055771862 podStartE2EDuration="2.055771862s" podCreationTimestamp="2026-04-16 18:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:18:23.054470557 +0000 UTC m=+173.143019969" 
watchObservedRunningTime="2026-04-16 18:18:23.055771862 +0000 UTC m=+173.144321265" Apr 16 18:18:24.003400 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.003367 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xbf7w" Apr 16 18:18:24.040306 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.040261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mkfg8" event={"ID":"a4e48994-8aa5-499b-9d41-ca0fd8b95a04","Type":"ContainerStarted","Data":"bb47bf58d511eef439d10e139024a23dbaa569c81c87e9225361e407e17e2dc3"} Apr 16 18:18:24.043466 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.043436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm" event={"ID":"93cf13c9-c280-49b0-9865-5917d5bf1263","Type":"ContainerStarted","Data":"f0a25ada08310a72ea48961eb8bcc54b83afa4d2f0abb3e73503d6e75cf1acdb"} Apr 16 18:18:24.043591 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.043471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm" event={"ID":"93cf13c9-c280-49b0-9865-5917d5bf1263","Type":"ContainerStarted","Data":"3c689084a761484c214e1f6025b059e702d34288f4aab1610e33bf0eaebc3a2b"} Apr 16 18:18:24.045437 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.045402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" event={"ID":"5cd34ba5-a09f-4513-a54c-72a781092903","Type":"ContainerStarted","Data":"6bf77bda9ed0f6621159fc9662e73fb714d1343063de9a23625b881c883a95a9"} Apr 16 18:18:24.045690 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.045667 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" Apr 16 
18:18:24.051220 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.051199 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" Apr 16 18:18:24.060292 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.060247 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-84kxm" podStartSLOduration=33.052171007 podStartE2EDuration="35.06023188s" podCreationTimestamp="2026-04-16 18:17:49 +0000 UTC" firstStartedPulling="2026-04-16 18:18:21.314594698 +0000 UTC m=+171.403144090" lastFinishedPulling="2026-04-16 18:18:23.322655565 +0000 UTC m=+173.411204963" observedRunningTime="2026-04-16 18:18:24.059764751 +0000 UTC m=+174.148314165" watchObservedRunningTime="2026-04-16 18:18:24.06023188 +0000 UTC m=+174.148781294" Apr 16 18:18:24.074398 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:24.074359 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bbjsv" podStartSLOduration=1.843395975 podStartE2EDuration="3.074345125s" podCreationTimestamp="2026-04-16 18:18:21 +0000 UTC" firstStartedPulling="2026-04-16 18:18:22.093544833 +0000 UTC m=+172.182094239" lastFinishedPulling="2026-04-16 18:18:23.324493993 +0000 UTC m=+173.413043389" observedRunningTime="2026-04-16 18:18:24.074228917 +0000 UTC m=+174.162778328" watchObservedRunningTime="2026-04-16 18:18:24.074345125 +0000 UTC m=+174.162894539" Apr 16 18:18:25.049045 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:25.048958 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mkfg8" event={"ID":"a4e48994-8aa5-499b-9d41-ca0fd8b95a04","Type":"ContainerStarted","Data":"b7543f7bbca9cf1dd161e031c29b0f5dca2def6780ec35f783c43aac53be4ead"} Apr 16 18:18:25.069994 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:18:25.069949 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mkfg8" podStartSLOduration=1.456018965 podStartE2EDuration="4.069935427s" podCreationTimestamp="2026-04-16 18:18:21 +0000 UTC" firstStartedPulling="2026-04-16 18:18:22.084667962 +0000 UTC m=+172.173217357" lastFinishedPulling="2026-04-16 18:18:24.698584417 +0000 UTC m=+174.787133819" observedRunningTime="2026-04-16 18:18:25.068093956 +0000 UTC m=+175.156643363" watchObservedRunningTime="2026-04-16 18:18:25.069935427 +0000 UTC m=+175.158484840" Apr 16 18:18:29.181897 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.181863 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7z8sx"] Apr 16 18:18:29.187143 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.187122 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.190145 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.190116 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:18:29.190145 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.190125 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:18:29.190335 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.190170 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:18:29.191240 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.191216 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-g6plr\"" Apr 16 18:18:29.191346 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.191260 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:18:29.191408 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.191225 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:18:29.191469 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.191225 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:18:29.249129 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249093 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.249298 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-tls\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.249298 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-textfile\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.249298 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-sys\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.249298 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249218 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-root\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.249298 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249246 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.249298 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249282 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/edf117dd-4ff2-489b-b752-a7843cb792f3-metrics-client-ca\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.249510 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rjdl\" (UniqueName: \"kubernetes.io/projected/edf117dd-4ff2-489b-b752-a7843cb792f3-kube-api-access-9rjdl\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.249510 
ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.249375 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-wtmp\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.349819 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.349787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.349970 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.349826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-tls\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.349970 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.349852 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-textfile\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350064 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.349965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-sys\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " 
pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350064 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-root\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350064 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350065 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-sys\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/edf117dd-4ff2-489b-b752-a7843cb792f3-metrics-client-ca\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350114 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-root\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " 
pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350146 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-textfile\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rjdl\" (UniqueName: \"kubernetes.io/projected/edf117dd-4ff2-489b-b752-a7843cb792f3-kube-api-access-9rjdl\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350487 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350253 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-wtmp\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350487 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-wtmp\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350655 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350635 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/edf117dd-4ff2-489b-b752-a7843cb792f3-metrics-client-ca\") pod \"node-exporter-7z8sx\" (UID: 
\"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.350717 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.350674 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.352180 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.352159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-tls\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.352273 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.352232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/edf117dd-4ff2-489b-b752-a7843cb792f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.359036 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.359014 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rjdl\" (UniqueName: \"kubernetes.io/projected/edf117dd-4ff2-489b-b752-a7843cb792f3-kube-api-access-9rjdl\") pod \"node-exporter-7z8sx\" (UID: \"edf117dd-4ff2-489b-b752-a7843cb792f3\") " pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.496372 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:29.496301 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7z8sx" Apr 16 18:18:29.504173 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:29.504151 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedf117dd_4ff2_489b_b752_a7843cb792f3.slice/crio-ded5523c56d826e1afe8d55e1e883401b6d45afc0ceb6df17f46527101ba7e19 WatchSource:0}: Error finding container ded5523c56d826e1afe8d55e1e883401b6d45afc0ceb6df17f46527101ba7e19: Status 404 returned error can't find the container with id ded5523c56d826e1afe8d55e1e883401b6d45afc0ceb6df17f46527101ba7e19 Apr 16 18:18:30.062509 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:30.062473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7z8sx" event={"ID":"edf117dd-4ff2-489b-b752-a7843cb792f3","Type":"ContainerStarted","Data":"ded5523c56d826e1afe8d55e1e883401b6d45afc0ceb6df17f46527101ba7e19"} Apr 16 18:18:31.066016 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:31.065981 2573 generic.go:358] "Generic (PLEG): container finished" podID="edf117dd-4ff2-489b-b752-a7843cb792f3" containerID="1865dfd05e0f618426de6b1e87d9f39ba94156ba1734a26fb8591b3ded68c80d" exitCode=0 Apr 16 18:18:31.066406 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:31.066046 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7z8sx" event={"ID":"edf117dd-4ff2-489b-b752-a7843cb792f3","Type":"ContainerDied","Data":"1865dfd05e0f618426de6b1e87d9f39ba94156ba1734a26fb8591b3ded68c80d"} Apr 16 18:18:32.073065 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:32.073030 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7z8sx" event={"ID":"edf117dd-4ff2-489b-b752-a7843cb792f3","Type":"ContainerStarted","Data":"5f00da37d0017585c0925b72160373396ed5788c0cd0ec2154fe145201423792"} Apr 16 18:18:32.073512 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:32.073096 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7z8sx" event={"ID":"edf117dd-4ff2-489b-b752-a7843cb792f3","Type":"ContainerStarted","Data":"4adea459278e4d187b66bd20b34aec667e0333feeee8b12ba1179b5da4c629ff"} Apr 16 18:18:32.092923 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:32.092880 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7z8sx" podStartSLOduration=1.9002275800000001 podStartE2EDuration="3.092866377s" podCreationTimestamp="2026-04-16 18:18:29 +0000 UTC" firstStartedPulling="2026-04-16 18:18:29.505637473 +0000 UTC m=+179.594186864" lastFinishedPulling="2026-04-16 18:18:30.698276265 +0000 UTC m=+180.786825661" observedRunningTime="2026-04-16 18:18:32.091834699 +0000 UTC m=+182.180384148" watchObservedRunningTime="2026-04-16 18:18:32.092866377 +0000 UTC m=+182.181415790" Apr 16 18:18:33.864327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:33.864293 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf"] Apr 16 18:18:33.867432 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:33.867416 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" Apr 16 18:18:33.869825 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:33.869804 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 18:18:33.869939 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:33.869901 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-2xs8q\"" Apr 16 18:18:33.875451 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:33.875431 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf"] Apr 16 18:18:33.985351 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:33.985320 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e1a285c0-f532-4ac2-821c-860f34c577d3-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-92cxf\" (UID: \"e1a285c0-f532-4ac2-821c-860f34c577d3\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" Apr 16 18:18:34.086533 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:34.086504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e1a285c0-f532-4ac2-821c-860f34c577d3-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-92cxf\" (UID: \"e1a285c0-f532-4ac2-821c-860f34c577d3\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" Apr 16 18:18:34.088906 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:34.088881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e1a285c0-f532-4ac2-821c-860f34c577d3-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-92cxf\" (UID: \"e1a285c0-f532-4ac2-821c-860f34c577d3\") " 
pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" Apr 16 18:18:34.176539 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:34.176481 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" Apr 16 18:18:34.293225 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:34.293199 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf"] Apr 16 18:18:34.295867 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:34.295835 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a285c0_f532_4ac2_821c_860f34c577d3.slice/crio-340431a0ec28613ad972d1ec4581de882e7a6b6f5f6a93545c6dd19eb48f026b WatchSource:0}: Error finding container 340431a0ec28613ad972d1ec4581de882e7a6b6f5f6a93545c6dd19eb48f026b: Status 404 returned error can't find the container with id 340431a0ec28613ad972d1ec4581de882e7a6b6f5f6a93545c6dd19eb48f026b Apr 16 18:18:35.082979 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.082943 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" event={"ID":"e1a285c0-f532-4ac2-821c-860f34c577d3","Type":"ContainerStarted","Data":"340431a0ec28613ad972d1ec4581de882e7a6b6f5f6a93545c6dd19eb48f026b"} Apr 16 18:18:35.532969 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.532947 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-sqmkv"] Apr 16 18:18:35.536932 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.536911 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-sqmkv" Apr 16 18:18:35.543376 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.543354 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:18:35.543487 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.543471 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:18:35.545308 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.545294 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-r97p7\"" Apr 16 18:18:35.551324 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.551306 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-sqmkv"] Apr 16 18:18:35.598059 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.598037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sdjs\" (UniqueName: \"kubernetes.io/projected/1f2e7243-0f55-4aaf-9db9-f670dd987553-kube-api-access-2sdjs\") pod \"downloads-586b57c7b4-sqmkv\" (UID: \"1f2e7243-0f55-4aaf-9db9-f670dd987553\") " pod="openshift-console/downloads-586b57c7b4-sqmkv" Apr 16 18:18:35.698664 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.698623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sdjs\" (UniqueName: \"kubernetes.io/projected/1f2e7243-0f55-4aaf-9db9-f670dd987553-kube-api-access-2sdjs\") pod \"downloads-586b57c7b4-sqmkv\" (UID: \"1f2e7243-0f55-4aaf-9db9-f670dd987553\") " pod="openshift-console/downloads-586b57c7b4-sqmkv" Apr 16 18:18:35.709518 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.709491 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sdjs\" (UniqueName: 
\"kubernetes.io/projected/1f2e7243-0f55-4aaf-9db9-f670dd987553-kube-api-access-2sdjs\") pod \"downloads-586b57c7b4-sqmkv\" (UID: \"1f2e7243-0f55-4aaf-9db9-f670dd987553\") " pod="openshift-console/downloads-586b57c7b4-sqmkv" Apr 16 18:18:35.864218 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.864191 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-sqmkv" Apr 16 18:18:35.978772 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:35.978745 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-sqmkv"] Apr 16 18:18:35.982043 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:35.982011 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2e7243_0f55_4aaf_9db9_f670dd987553.slice/crio-5ed400882e08c8259a036412820408a0421c687f12f82460b54618644636762d WatchSource:0}: Error finding container 5ed400882e08c8259a036412820408a0421c687f12f82460b54618644636762d: Status 404 returned error can't find the container with id 5ed400882e08c8259a036412820408a0421c687f12f82460b54618644636762d Apr 16 18:18:36.086288 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:36.086260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" event={"ID":"e1a285c0-f532-4ac2-821c-860f34c577d3","Type":"ContainerStarted","Data":"9dfb1e4b2c1fb041426c78485881fb9028921c7664add0c561d96929f3d3cc41"} Apr 16 18:18:36.086653 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:36.086439 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" Apr 16 18:18:36.087394 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:36.087369 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-sqmkv" 
event={"ID":"1f2e7243-0f55-4aaf-9db9-f670dd987553","Type":"ContainerStarted","Data":"5ed400882e08c8259a036412820408a0421c687f12f82460b54618644636762d"} Apr 16 18:18:36.091225 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:36.091190 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" Apr 16 18:18:36.103658 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:36.103621 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92cxf" podStartSLOduration=1.883243301 podStartE2EDuration="3.103608692s" podCreationTimestamp="2026-04-16 18:18:33 +0000 UTC" firstStartedPulling="2026-04-16 18:18:34.297680326 +0000 UTC m=+184.386229717" lastFinishedPulling="2026-04-16 18:18:35.518045714 +0000 UTC m=+185.606595108" observedRunningTime="2026-04-16 18:18:36.103192066 +0000 UTC m=+186.191741479" watchObservedRunningTime="2026-04-16 18:18:36.103608692 +0000 UTC m=+186.192158105" Apr 16 18:18:44.051038 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:44.051007 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cfdd9f895-79lv8" Apr 16 18:18:45.492434 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.492404 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6df5c876d5-h5gwj"] Apr 16 18:18:45.496611 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.496586 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.499243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.499210 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:18:45.499243 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.499220 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:18:45.500400 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.500277 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:18:45.500400 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.500288 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bkqlw\"" Apr 16 18:18:45.500400 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.500373 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:18:45.500643 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.500327 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:18:45.506691 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.506671 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6df5c876d5-h5gwj"] Apr 16 18:18:45.578580 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.578548 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-config\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.578730 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:18:45.578589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-service-ca\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.578730 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.578651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw7r2\" (UniqueName: \"kubernetes.io/projected/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-kube-api-access-pw7r2\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.578730 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.578677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-oauth-serving-cert\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.578730 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.578697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-oauth-config\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.578730 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.578730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-serving-cert\") pod 
\"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.679925 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.679884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-config\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.680118 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.679949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-service-ca\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.680118 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.680001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw7r2\" (UniqueName: \"kubernetes.io/projected/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-kube-api-access-pw7r2\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.680118 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.680024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-oauth-serving-cert\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.680118 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.680053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-oauth-config\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.680329 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.680123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-serving-cert\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.680758 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.680718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-config\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.680865 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.680718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-service-ca\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.680922 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.680888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-oauth-serving-cert\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.683008 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.682975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-oauth-config\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.683123 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.683042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-serving-cert\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.689360 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.689327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw7r2\" (UniqueName: \"kubernetes.io/projected/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-kube-api-access-pw7r2\") pod \"console-6df5c876d5-h5gwj\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") " pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:45.807974 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:45.807894 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:51.304748 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:51.304723 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6df5c876d5-h5gwj"] Apr 16 18:18:51.312150 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:18:51.312116 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee920f1_f3d3_46b6_b62a_c8ecfe45174c.slice/crio-e5ab643b9057e5a0f5d40131b26c342aa7b3eb966186df2bfc0b63fc9542bed4 WatchSource:0}: Error finding container e5ab643b9057e5a0f5d40131b26c342aa7b3eb966186df2bfc0b63fc9542bed4: Status 404 returned error can't find the container with id e5ab643b9057e5a0f5d40131b26c342aa7b3eb966186df2bfc0b63fc9542bed4 Apr 16 18:18:52.132875 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:52.132834 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df5c876d5-h5gwj" event={"ID":"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c","Type":"ContainerStarted","Data":"e5ab643b9057e5a0f5d40131b26c342aa7b3eb966186df2bfc0b63fc9542bed4"} Apr 16 18:18:52.134796 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:52.134726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-sqmkv" event={"ID":"1f2e7243-0f55-4aaf-9db9-f670dd987553","Type":"ContainerStarted","Data":"490eeadc7c18e7fe063afb88765a7c81f5c775dadec7ce3105b47dadb05d1eaa"} Apr 16 18:18:52.135658 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:52.135172 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-sqmkv" Apr 16 18:18:52.155510 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:52.155480 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-sqmkv" Apr 16 18:18:52.162598 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:52.162531 
2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-sqmkv" podStartSLOduration=1.867150778 podStartE2EDuration="17.162506765s" podCreationTimestamp="2026-04-16 18:18:35 +0000 UTC" firstStartedPulling="2026-04-16 18:18:35.98411159 +0000 UTC m=+186.072660985" lastFinishedPulling="2026-04-16 18:18:51.279467582 +0000 UTC m=+201.368016972" observedRunningTime="2026-04-16 18:18:52.15604501 +0000 UTC m=+202.244594424" watchObservedRunningTime="2026-04-16 18:18:52.162506765 +0000 UTC m=+202.251056179" Apr 16 18:18:55.147444 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:55.147405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df5c876d5-h5gwj" event={"ID":"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c","Type":"ContainerStarted","Data":"6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d"} Apr 16 18:18:55.165379 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:55.165330 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6df5c876d5-h5gwj" podStartSLOduration=6.680504558 podStartE2EDuration="10.165315643s" podCreationTimestamp="2026-04-16 18:18:45 +0000 UTC" firstStartedPulling="2026-04-16 18:18:51.314110406 +0000 UTC m=+201.402659797" lastFinishedPulling="2026-04-16 18:18:54.798921487 +0000 UTC m=+204.887470882" observedRunningTime="2026-04-16 18:18:55.164509773 +0000 UTC m=+205.253059181" watchObservedRunningTime="2026-04-16 18:18:55.165315643 +0000 UTC m=+205.253865056" Apr 16 18:18:55.808659 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:55.808620 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:55.808659 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:55.808668 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:55.814286 
ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:55.814258 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:18:56.154402 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:18:56.154321 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6df5c876d5-h5gwj" Apr 16 18:19:00.764886 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:00.764853 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4rlf8_61e9f3e6-a9f4-46cd-b1c0-cbdc48158979/serve-healthcheck-canary/0.log" Apr 16 18:19:03.876937 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:03.876911 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6df5c876d5-h5gwj"] Apr 16 18:19:08.186184 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:08.186153 2573 generic.go:358] "Generic (PLEG): container finished" podID="fd04a1c3-675f-42ec-b892-a37ac5e7f02c" containerID="d5bbbe909d90b97cf9c167076a689d040c70008496c7dd799a5078a3d23b09ee" exitCode=0 Apr 16 18:19:08.186184 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:08.186189 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7" event={"ID":"fd04a1c3-675f-42ec-b892-a37ac5e7f02c","Type":"ContainerDied","Data":"d5bbbe909d90b97cf9c167076a689d040c70008496c7dd799a5078a3d23b09ee"} Apr 16 18:19:08.186617 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:08.186458 2573 scope.go:117] "RemoveContainer" containerID="d5bbbe909d90b97cf9c167076a689d040c70008496c7dd799a5078a3d23b09ee" Apr 16 18:19:09.190530 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:09.190497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-crpf7" 
event={"ID":"fd04a1c3-675f-42ec-b892-a37ac5e7f02c","Type":"ContainerStarted","Data":"0d8a188f4b8bcb2ae9bfd067c0d791356313936378f2d1d76bf013b3f23b1237"}
Apr 16 18:19:28.895660 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:28.895595 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6df5c876d5-h5gwj" podUID="4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" containerName="console" containerID="cri-o://6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d" gracePeriod=15
Apr 16 18:19:29.160756 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.160737 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df5c876d5-h5gwj_4ee920f1-f3d3-46b6-b62a-c8ecfe45174c/console/0.log"
Apr 16 18:19:29.160862 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.160797 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df5c876d5-h5gwj"
Apr 16 18:19:29.233421 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233387 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw7r2\" (UniqueName: \"kubernetes.io/projected/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-kube-api-access-pw7r2\") pod \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") "
Apr 16 18:19:29.233591 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233431 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-service-ca\") pod \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") "
Apr 16 18:19:29.233591 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233463 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-oauth-serving-cert\") pod \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") "
Apr 16 18:19:29.233591 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233490 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-config\") pod \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") "
Apr 16 18:19:29.233591 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233532 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-serving-cert\") pod \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") "
Apr 16 18:19:29.233591 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233565 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-oauth-config\") pod \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\" (UID: \"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c\") "
Apr 16 18:19:29.233937 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233904 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-config" (OuterVolumeSpecName: "console-config") pod "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" (UID: "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:29.234053 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233918 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" (UID: "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:29.234053 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.233912 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-service-ca" (OuterVolumeSpecName: "service-ca") pod "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" (UID: "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:19:29.235686 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.235652 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-kube-api-access-pw7r2" (OuterVolumeSpecName: "kube-api-access-pw7r2") pod "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" (UID: "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c"). InnerVolumeSpecName "kube-api-access-pw7r2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:19:29.235686 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.235669 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" (UID: "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:29.235807 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.235714 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" (UID: "4ee920f1-f3d3-46b6-b62a-c8ecfe45174c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:19:29.244293 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.244277 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df5c876d5-h5gwj_4ee920f1-f3d3-46b6-b62a-c8ecfe45174c/console/0.log"
Apr 16 18:19:29.244383 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.244308 2573 generic.go:358] "Generic (PLEG): container finished" podID="4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" containerID="6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d" exitCode=2
Apr 16 18:19:29.244383 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.244350 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df5c876d5-h5gwj" event={"ID":"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c","Type":"ContainerDied","Data":"6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d"}
Apr 16 18:19:29.244383 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.244371 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df5c876d5-h5gwj" event={"ID":"4ee920f1-f3d3-46b6-b62a-c8ecfe45174c","Type":"ContainerDied","Data":"e5ab643b9057e5a0f5d40131b26c342aa7b3eb966186df2bfc0b63fc9542bed4"}
Apr 16 18:19:29.244503 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.244385 2573 scope.go:117] "RemoveContainer" containerID="6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d"
Apr 16 18:19:29.244503 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.244383 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df5c876d5-h5gwj"
Apr 16 18:19:29.251638 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.251622 2573 scope.go:117] "RemoveContainer" containerID="6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d"
Apr 16 18:19:29.251927 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:19:29.251902 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d\": container with ID starting with 6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d not found: ID does not exist" containerID="6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d"
Apr 16 18:19:29.251998 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.251934 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d"} err="failed to get container status \"6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d\": rpc error: code = NotFound desc = could not find container \"6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d\": container with ID starting with 6c97902af9ce90d582ee12ba772155a670d99589beef3ec712ba03e8de916e5d not found: ID does not exist"
Apr 16 18:19:29.267745 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.267723 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6df5c876d5-h5gwj"]
Apr 16 18:19:29.271095 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.271060 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6df5c876d5-h5gwj"]
Apr 16 18:19:29.334031 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.334005 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-serving-cert\") on node \"ip-10-0-135-125.ec2.internal\" DevicePath \"\""
Apr 16 18:19:29.334031 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.334027 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-oauth-config\") on node \"ip-10-0-135-125.ec2.internal\" DevicePath \"\""
Apr 16 18:19:29.334180 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.334037 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pw7r2\" (UniqueName: \"kubernetes.io/projected/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-kube-api-access-pw7r2\") on node \"ip-10-0-135-125.ec2.internal\" DevicePath \"\""
Apr 16 18:19:29.334180 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.334046 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-service-ca\") on node \"ip-10-0-135-125.ec2.internal\" DevicePath \"\""
Apr 16 18:19:29.334180 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.334056 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-oauth-serving-cert\") on node \"ip-10-0-135-125.ec2.internal\" DevicePath \"\""
Apr 16 18:19:29.334180 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:29.334064 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c-console-config\") on node \"ip-10-0-135-125.ec2.internal\" DevicePath \"\""
Apr 16 18:19:30.504029 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:30.503988 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" path="/var/lib/kubelet/pods/4ee920f1-f3d3-46b6-b62a-c8ecfe45174c/volumes"
Apr 16 18:19:42.232709 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:42.232673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:19:42.234995 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:42.234974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c559edc-905d-4a66-b3b6-e4767670d083-metrics-certs\") pod \"network-metrics-daemon-7rqfn\" (UID: \"1c559edc-905d-4a66-b3b6-e4767670d083\") " pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:19:42.504316 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:42.504236 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4plx6\""
Apr 16 18:19:42.511827 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:42.511807 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rqfn"
Apr 16 18:19:42.625960 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:42.625929 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7rqfn"]
Apr 16 18:19:42.628607 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:19:42.628570 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c559edc_905d_4a66_b3b6_e4767670d083.slice/crio-97f53781d2f66ff39fb9fbc018729ffe6ecef7010ba4f2ada816315feb8bba50 WatchSource:0}: Error finding container 97f53781d2f66ff39fb9fbc018729ffe6ecef7010ba4f2ada816315feb8bba50: Status 404 returned error can't find the container with id 97f53781d2f66ff39fb9fbc018729ffe6ecef7010ba4f2ada816315feb8bba50
Apr 16 18:19:43.286196 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:43.286158 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rqfn" event={"ID":"1c559edc-905d-4a66-b3b6-e4767670d083","Type":"ContainerStarted","Data":"97f53781d2f66ff39fb9fbc018729ffe6ecef7010ba4f2ada816315feb8bba50"}
Apr 16 18:19:44.289799 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:44.289762 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rqfn" event={"ID":"1c559edc-905d-4a66-b3b6-e4767670d083","Type":"ContainerStarted","Data":"cdd1f19b14c9d1b72a33ec4095b29a2c5376349c0650c0db5950b006c249f6b9"}
Apr 16 18:19:44.289799 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:44.289798 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rqfn" event={"ID":"1c559edc-905d-4a66-b3b6-e4767670d083","Type":"ContainerStarted","Data":"85dae31f873473bad97f34996d4047286b27421fd53bad4966b224b09a0b9c1f"}
Apr 16 18:19:44.309696 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:19:44.309653 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7rqfn" podStartSLOduration=253.356764646 podStartE2EDuration="4m14.309638208s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:19:42.630246945 +0000 UTC m=+252.718796335" lastFinishedPulling="2026-04-16 18:19:43.583120507 +0000 UTC m=+253.671669897" observedRunningTime="2026-04-16 18:19:44.307524426 +0000 UTC m=+254.396073840" watchObservedRunningTime="2026-04-16 18:19:44.309638208 +0000 UTC m=+254.398187620"
Apr 16 18:20:30.371133 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:20:30.371104 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:20:30.371661 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:20:30.371431 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:20:30.374608 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:20:30.374584 2573 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 18:22:16.148868 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.148830 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"]
Apr 16 18:22:16.149452 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.149104 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" containerName="console"
Apr 16 18:22:16.149452 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.149115 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" containerName="console"
Apr 16 18:22:16.149452 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.149176 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ee920f1-f3d3-46b6-b62a-c8ecfe45174c" containerName="console"
Apr 16 18:22:16.151254 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.151237 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"
Apr 16 18:22:16.154026 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.154006 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 18:22:16.154524 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.154506 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:22:16.154755 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.154739 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-ctwqq\""
Apr 16 18:22:16.164486 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.164467 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"]
Apr 16 18:22:16.311605 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.311573 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmx6\" (UniqueName: \"kubernetes.io/projected/cb1e0698-06aa-48a4-800c-2f54fecaa0ef-kube-api-access-cwmx6\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tc7r6\" (UID: \"cb1e0698-06aa-48a4-800c-2f54fecaa0ef\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"
Apr 16 18:22:16.311605 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.311610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb1e0698-06aa-48a4-800c-2f54fecaa0ef-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tc7r6\" (UID: \"cb1e0698-06aa-48a4-800c-2f54fecaa0ef\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"
Apr 16 18:22:16.412979 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.412905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmx6\" (UniqueName: \"kubernetes.io/projected/cb1e0698-06aa-48a4-800c-2f54fecaa0ef-kube-api-access-cwmx6\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tc7r6\" (UID: \"cb1e0698-06aa-48a4-800c-2f54fecaa0ef\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"
Apr 16 18:22:16.412979 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.412943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb1e0698-06aa-48a4-800c-2f54fecaa0ef-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tc7r6\" (UID: \"cb1e0698-06aa-48a4-800c-2f54fecaa0ef\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"
Apr 16 18:22:16.413269 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.413254 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb1e0698-06aa-48a4-800c-2f54fecaa0ef-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tc7r6\" (UID: \"cb1e0698-06aa-48a4-800c-2f54fecaa0ef\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"
Apr 16 18:22:16.430483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.430456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmx6\" (UniqueName: \"kubernetes.io/projected/cb1e0698-06aa-48a4-800c-2f54fecaa0ef-kube-api-access-cwmx6\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tc7r6\" (UID: \"cb1e0698-06aa-48a4-800c-2f54fecaa0ef\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"
Apr 16 18:22:16.460307 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.460286 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"
Apr 16 18:22:16.579357 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.579197 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6"]
Apr 16 18:22:16.582815 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:22:16.582779 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb1e0698_06aa_48a4_800c_2f54fecaa0ef.slice/crio-06091b1bf02d81ca908d8adf63d5ca4af3c3a9e60a6af63ca96c7f9264feb55e WatchSource:0}: Error finding container 06091b1bf02d81ca908d8adf63d5ca4af3c3a9e60a6af63ca96c7f9264feb55e: Status 404 returned error can't find the container with id 06091b1bf02d81ca908d8adf63d5ca4af3c3a9e60a6af63ca96c7f9264feb55e
Apr 16 18:22:16.585137 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.585119 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:22:16.694968 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:16.691211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6" event={"ID":"cb1e0698-06aa-48a4-800c-2f54fecaa0ef","Type":"ContainerStarted","Data":"06091b1bf02d81ca908d8adf63d5ca4af3c3a9e60a6af63ca96c7f9264feb55e"}
Apr 16 18:22:19.702909 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:19.702873 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6" event={"ID":"cb1e0698-06aa-48a4-800c-2f54fecaa0ef","Type":"ContainerStarted","Data":"1f27ab5738167e8a39655fc6a14aaa526d33305d48879cfbb6f7e1e1111d87e6"}
Apr 16 18:22:19.723587 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:19.723524 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tc7r6" podStartSLOduration=1.421678985 podStartE2EDuration="3.72350866s" podCreationTimestamp="2026-04-16 18:22:16 +0000 UTC" firstStartedPulling="2026-04-16 18:22:16.585300593 +0000 UTC m=+406.673849990" lastFinishedPulling="2026-04-16 18:22:18.887130271 +0000 UTC m=+408.975679665" observedRunningTime="2026-04-16 18:22:19.722461913 +0000 UTC m=+409.811011335" watchObservedRunningTime="2026-04-16 18:22:19.72350866 +0000 UTC m=+409.812058072"
Apr 16 18:22:23.100526 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.100494 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-sbq52"]
Apr 16 18:22:23.102659 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.102642 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:23.105327 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.105305 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 18:22:23.105546 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.105528 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-pmwz8\""
Apr 16 18:22:23.106486 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.106470 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 18:22:23.118145 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.118124 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-sbq52"]
Apr 16 18:22:23.265974 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.265940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5m4\" (UniqueName: \"kubernetes.io/projected/8bf3c3e5-2de8-4f43-8210-f00996a3d904-kube-api-access-cj5m4\") pod \"cert-manager-webhook-597b96b99b-sbq52\" (UID: \"8bf3c3e5-2de8-4f43-8210-f00996a3d904\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:23.266158 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.265988 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bf3c3e5-2de8-4f43-8210-f00996a3d904-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-sbq52\" (UID: \"8bf3c3e5-2de8-4f43-8210-f00996a3d904\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:23.367124 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.367034 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5m4\" (UniqueName: \"kubernetes.io/projected/8bf3c3e5-2de8-4f43-8210-f00996a3d904-kube-api-access-cj5m4\") pod \"cert-manager-webhook-597b96b99b-sbq52\" (UID: \"8bf3c3e5-2de8-4f43-8210-f00996a3d904\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:23.367124 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.367096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bf3c3e5-2de8-4f43-8210-f00996a3d904-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-sbq52\" (UID: \"8bf3c3e5-2de8-4f43-8210-f00996a3d904\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:23.387706 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.387681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5m4\" (UniqueName: \"kubernetes.io/projected/8bf3c3e5-2de8-4f43-8210-f00996a3d904-kube-api-access-cj5m4\") pod \"cert-manager-webhook-597b96b99b-sbq52\" (UID: \"8bf3c3e5-2de8-4f43-8210-f00996a3d904\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:23.389874 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.389853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8bf3c3e5-2de8-4f43-8210-f00996a3d904-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-sbq52\" (UID: \"8bf3c3e5-2de8-4f43-8210-f00996a3d904\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:23.410849 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.410828 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:23.529793 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.529768 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-sbq52"]
Apr 16 18:22:23.532720 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:22:23.532690 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bf3c3e5_2de8_4f43_8210_f00996a3d904.slice/crio-e0464edff6c99d086d5139e6370f2694f74a8424bc6d269395700b11d95b01dd WatchSource:0}: Error finding container e0464edff6c99d086d5139e6370f2694f74a8424bc6d269395700b11d95b01dd: Status 404 returned error can't find the container with id e0464edff6c99d086d5139e6370f2694f74a8424bc6d269395700b11d95b01dd
Apr 16 18:22:23.715062 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:23.714983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52" event={"ID":"8bf3c3e5-2de8-4f43-8210-f00996a3d904","Type":"ContainerStarted","Data":"e0464edff6c99d086d5139e6370f2694f74a8424bc6d269395700b11d95b01dd"}
Apr 16 18:22:25.301094 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.301046 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"]
Apr 16 18:22:25.303491 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.303467 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"
Apr 16 18:22:25.306427 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.306401 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-6c2ln\""
Apr 16 18:22:25.311473 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.311450 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"]
Apr 16 18:22:25.484883 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.484850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f138f103-2be8-4454-ab6a-cdbd630cf887-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xn2q6\" (UID: \"f138f103-2be8-4454-ab6a-cdbd630cf887\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"
Apr 16 18:22:25.485050 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.484897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcd7s\" (UniqueName: \"kubernetes.io/projected/f138f103-2be8-4454-ab6a-cdbd630cf887-kube-api-access-jcd7s\") pod \"cert-manager-cainjector-8966b78d4-xn2q6\" (UID: \"f138f103-2be8-4454-ab6a-cdbd630cf887\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"
Apr 16 18:22:25.585962 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.585921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f138f103-2be8-4454-ab6a-cdbd630cf887-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xn2q6\" (UID: \"f138f103-2be8-4454-ab6a-cdbd630cf887\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"
Apr 16 18:22:25.586159 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.585970 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcd7s\" (UniqueName: \"kubernetes.io/projected/f138f103-2be8-4454-ab6a-cdbd630cf887-kube-api-access-jcd7s\") pod \"cert-manager-cainjector-8966b78d4-xn2q6\" (UID: \"f138f103-2be8-4454-ab6a-cdbd630cf887\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"
Apr 16 18:22:25.595239 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.595210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f138f103-2be8-4454-ab6a-cdbd630cf887-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xn2q6\" (UID: \"f138f103-2be8-4454-ab6a-cdbd630cf887\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"
Apr 16 18:22:25.595472 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.595446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcd7s\" (UniqueName: \"kubernetes.io/projected/f138f103-2be8-4454-ab6a-cdbd630cf887-kube-api-access-jcd7s\") pod \"cert-manager-cainjector-8966b78d4-xn2q6\" (UID: \"f138f103-2be8-4454-ab6a-cdbd630cf887\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"
Apr 16 18:22:25.615147 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:25.615123 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"
Apr 16 18:22:26.453705 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:26.453680 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xn2q6"]
Apr 16 18:22:26.456975 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:22:26.456940 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf138f103_2be8_4454_ab6a_cdbd630cf887.slice/crio-68f859f3c1537ed224a6c34d39d4de49612af61c3bcbb71f5904bd73e6c95e3a WatchSource:0}: Error finding container 68f859f3c1537ed224a6c34d39d4de49612af61c3bcbb71f5904bd73e6c95e3a: Status 404 returned error can't find the container with id 68f859f3c1537ed224a6c34d39d4de49612af61c3bcbb71f5904bd73e6c95e3a
Apr 16 18:22:26.725262 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:26.725179 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52" event={"ID":"8bf3c3e5-2de8-4f43-8210-f00996a3d904","Type":"ContainerStarted","Data":"b8826986f7880f1698c00413a6bb8afaff4bf936507967192016ed646c3b85ac"}
Apr 16 18:22:26.725262 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:26.725244 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:26.726483 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:26.726459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6" event={"ID":"f138f103-2be8-4454-ab6a-cdbd630cf887","Type":"ContainerStarted","Data":"463a804515c3603bba8b9429746322d00e780bd5a667ec74d8edcd8c8a1ea3f9"}
Apr 16 18:22:26.726559 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:26.726491 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6" event={"ID":"f138f103-2be8-4454-ab6a-cdbd630cf887","Type":"ContainerStarted","Data":"68f859f3c1537ed224a6c34d39d4de49612af61c3bcbb71f5904bd73e6c95e3a"}
Apr 16 18:22:26.743964 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:26.743916 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52" podStartSLOduration=0.89668666 podStartE2EDuration="3.743904309s" podCreationTimestamp="2026-04-16 18:22:23 +0000 UTC" firstStartedPulling="2026-04-16 18:22:23.534473165 +0000 UTC m=+413.623022556" lastFinishedPulling="2026-04-16 18:22:26.381690803 +0000 UTC m=+416.470240205" observedRunningTime="2026-04-16 18:22:26.742511299 +0000 UTC m=+416.831060722" watchObservedRunningTime="2026-04-16 18:22:26.743904309 +0000 UTC m=+416.832453721"
Apr 16 18:22:26.761629 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:26.761588 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-xn2q6" podStartSLOduration=1.761574567 podStartE2EDuration="1.761574567s" podCreationTimestamp="2026-04-16 18:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:22:26.760528344 +0000 UTC m=+416.849077757" watchObservedRunningTime="2026-04-16 18:22:26.761574567 +0000 UTC m=+416.850123979"
Apr 16 18:22:32.730892 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:32.730857 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-sbq52"
Apr 16 18:22:42.023032 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.022999 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-8x62v"]
Apr 16 18:22:42.025130 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.025114 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-8x62v"
Apr 16 18:22:42.027585 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.027555 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-nbb2j\""
Apr 16 18:22:42.036853 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.036823 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-8x62v"]
Apr 16 18:22:42.099766 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.099734 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1df8215d-6a22-450e-ac6d-2efef2cad60b-bound-sa-token\") pod \"cert-manager-759f64656b-8x62v\" (UID: \"1df8215d-6a22-450e-ac6d-2efef2cad60b\") " pod="cert-manager/cert-manager-759f64656b-8x62v"
Apr 16 18:22:42.099901 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.099780 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvq4n\" (UniqueName: \"kubernetes.io/projected/1df8215d-6a22-450e-ac6d-2efef2cad60b-kube-api-access-wvq4n\") pod \"cert-manager-759f64656b-8x62v\" (UID: \"1df8215d-6a22-450e-ac6d-2efef2cad60b\") " pod="cert-manager/cert-manager-759f64656b-8x62v"
Apr 16 18:22:42.200640 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.200611 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvq4n\" (UniqueName: \"kubernetes.io/projected/1df8215d-6a22-450e-ac6d-2efef2cad60b-kube-api-access-wvq4n\") pod \"cert-manager-759f64656b-8x62v\" (UID: \"1df8215d-6a22-450e-ac6d-2efef2cad60b\") " pod="cert-manager/cert-manager-759f64656b-8x62v"
Apr 16 18:22:42.200762 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.200680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1df8215d-6a22-450e-ac6d-2efef2cad60b-bound-sa-token\") pod \"cert-manager-759f64656b-8x62v\" (UID: \"1df8215d-6a22-450e-ac6d-2efef2cad60b\") " pod="cert-manager/cert-manager-759f64656b-8x62v"
Apr 16 18:22:42.209063 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.209031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1df8215d-6a22-450e-ac6d-2efef2cad60b-bound-sa-token\") pod \"cert-manager-759f64656b-8x62v\" (UID: \"1df8215d-6a22-450e-ac6d-2efef2cad60b\") " pod="cert-manager/cert-manager-759f64656b-8x62v"
Apr 16 18:22:42.209178 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.209101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvq4n\" (UniqueName: \"kubernetes.io/projected/1df8215d-6a22-450e-ac6d-2efef2cad60b-kube-api-access-wvq4n\") pod \"cert-manager-759f64656b-8x62v\" (UID: \"1df8215d-6a22-450e-ac6d-2efef2cad60b\") " pod="cert-manager/cert-manager-759f64656b-8x62v"
Apr 16 18:22:42.334286 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.334258 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-8x62v" Apr 16 18:22:42.453253 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.453223 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-8x62v"] Apr 16 18:22:42.456612 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:22:42.456585 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df8215d_6a22_450e_ac6d_2efef2cad60b.slice/crio-8919ea1b953f791d51ebb9c97f77b4f4dc76fb0c81dbd97533af38be0f5466c8 WatchSource:0}: Error finding container 8919ea1b953f791d51ebb9c97f77b4f4dc76fb0c81dbd97533af38be0f5466c8: Status 404 returned error can't find the container with id 8919ea1b953f791d51ebb9c97f77b4f4dc76fb0c81dbd97533af38be0f5466c8 Apr 16 18:22:42.775614 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.775535 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-8x62v" event={"ID":"1df8215d-6a22-450e-ac6d-2efef2cad60b","Type":"ContainerStarted","Data":"6f69fc3877eeace5603fbea947250c700724712f5608a7514882bc7eb3616505"} Apr 16 18:22:42.775614 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.775572 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-8x62v" event={"ID":"1df8215d-6a22-450e-ac6d-2efef2cad60b","Type":"ContainerStarted","Data":"8919ea1b953f791d51ebb9c97f77b4f4dc76fb0c81dbd97533af38be0f5466c8"} Apr 16 18:22:42.794296 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:22:42.794193 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-8x62v" podStartSLOduration=0.794176285 podStartE2EDuration="794.176285ms" podCreationTimestamp="2026-04-16 18:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:22:42.793564389 +0000 UTC 
m=+432.882113802" watchObservedRunningTime="2026-04-16 18:22:42.794176285 +0000 UTC m=+432.882725699" Apr 16 18:23:13.078680 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.078644 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5d577d5585-5j685"] Apr 16 18:23:13.082139 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.082123 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.085073 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.085050 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 18:23:13.085201 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.085114 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 18:23:13.086073 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.086055 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:23:13.086131 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.086112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 18:23:13.086325 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.086309 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 18:23:13.086476 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.086459 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-hzx2r\"" Apr 16 18:23:13.094634 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.094616 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-5d577d5585-5j685"] Apr 16 18:23:13.128976 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.128953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95c65514-2913-4cdc-b1a1-bcda73619d7c-metrics-cert\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.129096 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.128991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw8vj\" (UniqueName: \"kubernetes.io/projected/95c65514-2913-4cdc-b1a1-bcda73619d7c-kube-api-access-zw8vj\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.129096 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.129030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c65514-2913-4cdc-b1a1-bcda73619d7c-cert\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.129096 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.129048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95c65514-2913-4cdc-b1a1-bcda73619d7c-manager-config\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.230096 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.230041 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw8vj\" (UniqueName: \"kubernetes.io/projected/95c65514-2913-4cdc-b1a1-bcda73619d7c-kube-api-access-zw8vj\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.230259 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.230136 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c65514-2913-4cdc-b1a1-bcda73619d7c-cert\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.230259 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.230172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95c65514-2913-4cdc-b1a1-bcda73619d7c-manager-config\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.230259 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.230210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95c65514-2913-4cdc-b1a1-bcda73619d7c-metrics-cert\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.230847 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.230824 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95c65514-2913-4cdc-b1a1-bcda73619d7c-manager-config\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: 
\"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.232548 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.232522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c65514-2913-4cdc-b1a1-bcda73619d7c-cert\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.232709 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.232687 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95c65514-2913-4cdc-b1a1-bcda73619d7c-metrics-cert\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.250038 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.250012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw8vj\" (UniqueName: \"kubernetes.io/projected/95c65514-2913-4cdc-b1a1-bcda73619d7c-kube-api-access-zw8vj\") pod \"lws-controller-manager-5d577d5585-5j685\" (UID: \"95c65514-2913-4cdc-b1a1-bcda73619d7c\") " pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.391598 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.391561 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:13.518052 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.518025 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5d577d5585-5j685"] Apr 16 18:23:13.519985 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:23:13.519956 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c65514_2913_4cdc_b1a1_bcda73619d7c.slice/crio-eea0ca14c5e6f78b684a4f6566781f36a170474a7f0a4a2b1e52ebc318ce54a7 WatchSource:0}: Error finding container eea0ca14c5e6f78b684a4f6566781f36a170474a7f0a4a2b1e52ebc318ce54a7: Status 404 returned error can't find the container with id eea0ca14c5e6f78b684a4f6566781f36a170474a7f0a4a2b1e52ebc318ce54a7 Apr 16 18:23:13.872212 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:13.872179 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" event={"ID":"95c65514-2913-4cdc-b1a1-bcda73619d7c","Type":"ContainerStarted","Data":"eea0ca14c5e6f78b684a4f6566781f36a170474a7f0a4a2b1e52ebc318ce54a7"} Apr 16 18:23:16.884108 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:16.884051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" event={"ID":"95c65514-2913-4cdc-b1a1-bcda73619d7c","Type":"ContainerStarted","Data":"c437eb9c70d7762f8f3024f17fb76b425cae4c6855053f19fc15e40e848bb29d"} Apr 16 18:23:16.884489 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:16.884134 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:16.915961 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:16.915912 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" podStartSLOduration=1.4341435200000001 podStartE2EDuration="3.915897892s" podCreationTimestamp="2026-04-16 18:23:13 +0000 UTC" firstStartedPulling="2026-04-16 18:23:13.521981565 +0000 UTC m=+463.610530957" lastFinishedPulling="2026-04-16 18:23:16.003735923 +0000 UTC m=+466.092285329" observedRunningTime="2026-04-16 18:23:16.913853156 +0000 UTC m=+467.002402570" watchObservedRunningTime="2026-04-16 18:23:16.915897892 +0000 UTC m=+467.004447305" Apr 16 18:23:27.890169 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:27.890137 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5d577d5585-5j685" Apr 16 18:23:43.375618 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.375589 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl"] Apr 16 18:23:43.378851 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.378835 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" Apr 16 18:23:43.381732 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.381699 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-dntcw\"" Apr 16 18:23:43.381857 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.381786 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 18:23:43.381857 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.381785 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 18:23:43.382686 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.382668 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 18:23:43.399515 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.399493 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl"] Apr 16 18:23:43.464581 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.464556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpkzf\" (UniqueName: \"kubernetes.io/projected/35c10eca-559b-4497-9fe2-69b68d70e723-kube-api-access-xpkzf\") pod \"dns-operator-controller-manager-844548ff4c-drxnl\" (UID: \"35c10eca-559b-4497-9fe2-69b68d70e723\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" Apr 16 18:23:43.565405 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.565375 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpkzf\" (UniqueName: \"kubernetes.io/projected/35c10eca-559b-4497-9fe2-69b68d70e723-kube-api-access-xpkzf\") pod 
\"dns-operator-controller-manager-844548ff4c-drxnl\" (UID: \"35c10eca-559b-4497-9fe2-69b68d70e723\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" Apr 16 18:23:43.579983 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.579963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpkzf\" (UniqueName: \"kubernetes.io/projected/35c10eca-559b-4497-9fe2-69b68d70e723-kube-api-access-xpkzf\") pod \"dns-operator-controller-manager-844548ff4c-drxnl\" (UID: \"35c10eca-559b-4497-9fe2-69b68d70e723\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" Apr 16 18:23:43.688796 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.688732 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" Apr 16 18:23:43.816509 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.816473 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl"] Apr 16 18:23:43.819470 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:23:43.819443 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c10eca_559b_4497_9fe2_69b68d70e723.slice/crio-ed46d8b5f8eb8da8299304e513a15969c1831930071fc0cafaa7d918e76e32e6 WatchSource:0}: Error finding container ed46d8b5f8eb8da8299304e513a15969c1831930071fc0cafaa7d918e76e32e6: Status 404 returned error can't find the container with id ed46d8b5f8eb8da8299304e513a15969c1831930071fc0cafaa7d918e76e32e6 Apr 16 18:23:43.966317 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:43.966226 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" event={"ID":"35c10eca-559b-4497-9fe2-69b68d70e723","Type":"ContainerStarted","Data":"ed46d8b5f8eb8da8299304e513a15969c1831930071fc0cafaa7d918e76e32e6"} Apr 16 
18:23:46.978690 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:46.978659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" event={"ID":"35c10eca-559b-4497-9fe2-69b68d70e723","Type":"ContainerStarted","Data":"c8bb162f013a8440b5708f28dcd3e4a004f9dd08d55823f9911b0e254af8dea5"} Apr 16 18:23:46.979105 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:46.978714 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" Apr 16 18:23:46.997474 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:46.997433 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl" podStartSLOduration=1.2646736619999999 podStartE2EDuration="3.99741876s" podCreationTimestamp="2026-04-16 18:23:43 +0000 UTC" firstStartedPulling="2026-04-16 18:23:43.82180309 +0000 UTC m=+493.910352496" lastFinishedPulling="2026-04-16 18:23:46.55454819 +0000 UTC m=+496.643097594" observedRunningTime="2026-04-16 18:23:46.996898329 +0000 UTC m=+497.085447743" watchObservedRunningTime="2026-04-16 18:23:46.99741876 +0000 UTC m=+497.085968172" Apr 16 18:23:47.253058 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.252977 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b69b67b75-fw45q"] Apr 16 18:23:47.257195 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.256030 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.259551 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.259529 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:23:47.259683 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.259530 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:23:47.260596 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.260573 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:23:47.260717 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.260597 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:23:47.260717 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.260581 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bkqlw\"" Apr 16 18:23:47.260717 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.260602 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:23:47.271036 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.271017 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b69b67b75-fw45q"] Apr 16 18:23:47.271439 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.271419 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:23:47.396062 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.396024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pkkl\" (UniqueName: 
\"kubernetes.io/projected/4e7783e5-2d88-49fe-b273-0c5b49cdae71-kube-api-access-6pkkl\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.396255 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.396102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-oauth-config\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.396255 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.396135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-config\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.396255 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.396194 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-trusted-ca-bundle\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.396386 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.396261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-serving-cert\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.396386 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:23:47.396302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-service-ca\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.396386 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.396320 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-oauth-serving-cert\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.497403 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.497350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-serving-cert\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.497588 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.497427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-service-ca\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.497588 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.497450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-oauth-serving-cert\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " 
pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.497588 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.497479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pkkl\" (UniqueName: \"kubernetes.io/projected/4e7783e5-2d88-49fe-b273-0c5b49cdae71-kube-api-access-6pkkl\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.497588 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.497512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-oauth-config\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.497809 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.497693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-config\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.497809 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.497751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-trusted-ca-bundle\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q" Apr 16 18:23:47.498324 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.498303 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-oauth-serving-cert\") pod 
\"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:47.498324 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.498308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-service-ca\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:47.498488 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.498377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-config\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:47.498547 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.498494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7783e5-2d88-49fe-b273-0c5b49cdae71-trusted-ca-bundle\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:47.499957 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.499932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-serving-cert\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:47.499957 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.499951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e7783e5-2d88-49fe-b273-0c5b49cdae71-console-oauth-config\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:47.505847 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.505794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pkkl\" (UniqueName: \"kubernetes.io/projected/4e7783e5-2d88-49fe-b273-0c5b49cdae71-kube-api-access-6pkkl\") pod \"console-5b69b67b75-fw45q\" (UID: \"4e7783e5-2d88-49fe-b273-0c5b49cdae71\") " pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:47.568750 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.568697 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:47.695337 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.695315 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b69b67b75-fw45q"]
Apr 16 18:23:47.698114 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:23:47.698086 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7783e5_2d88_49fe_b273_0c5b49cdae71.slice/crio-74ae0d273bb2ac0c7b2c09188584959b1d09a31ccd85b5f96bbd042f5533c19b WatchSource:0}: Error finding container 74ae0d273bb2ac0c7b2c09188584959b1d09a31ccd85b5f96bbd042f5533c19b: Status 404 returned error can't find the container with id 74ae0d273bb2ac0c7b2c09188584959b1d09a31ccd85b5f96bbd042f5533c19b
Apr 16 18:23:47.983750 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.983715 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b69b67b75-fw45q" event={"ID":"4e7783e5-2d88-49fe-b273-0c5b49cdae71","Type":"ContainerStarted","Data":"9b73c1807fa5ab4e3551389d94766ca757e87c283ee8ec432216e2cff6e8dc81"}
Apr 16 18:23:47.983750 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:47.983752 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b69b67b75-fw45q" event={"ID":"4e7783e5-2d88-49fe-b273-0c5b49cdae71","Type":"ContainerStarted","Data":"74ae0d273bb2ac0c7b2c09188584959b1d09a31ccd85b5f96bbd042f5533c19b"}
Apr 16 18:23:48.005978 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:48.005925 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b69b67b75-fw45q" podStartSLOduration=1.005908495 podStartE2EDuration="1.005908495s" podCreationTimestamp="2026-04-16 18:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:23:48.003956528 +0000 UTC m=+498.092505941" watchObservedRunningTime="2026-04-16 18:23:48.005908495 +0000 UTC m=+498.094457910"
Apr 16 18:23:57.569452 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:57.569419 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:57.569452 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:57.569455 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:57.574285 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:57.574263 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:23:57.986576 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:57.986500 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-drxnl"
Apr 16 18:23:58.017048 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:23:58.017018 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b69b67b75-fw45q"
Apr 16 18:25:30.390869 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:25:30.390839 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:25:30.391448 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:25:30.391265 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:30:30.410278 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:30:30.410204 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:30:30.410855 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:30:30.410321 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log"
Apr 16 18:33:43.132425 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.132343 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-stfls/must-gather-cttbx"]
Apr 16 18:33:43.135734 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.135712 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:33:43.138421 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.138400 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-stfls\"/\"default-dockercfg-hkchx\""
Apr 16 18:33:43.138687 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.138673 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-stfls\"/\"openshift-service-ca.crt\""
Apr 16 18:33:43.139571 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.139547 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-stfls\"/\"kube-root-ca.crt\""
Apr 16 18:33:43.158861 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.158835 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-stfls/must-gather-cttbx"]
Apr 16 18:33:43.210899 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.210858 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-must-gather-output\") pod \"must-gather-cttbx\" (UID: \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\") " pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:33:43.210899 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.210913 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc4pd\" (UniqueName: \"kubernetes.io/projected/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-kube-api-access-hc4pd\") pod \"must-gather-cttbx\" (UID: \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\") " pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:33:43.312197 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.312165 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-must-gather-output\") pod \"must-gather-cttbx\" (UID: \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\") " pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:33:43.312405 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.312211 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc4pd\" (UniqueName: \"kubernetes.io/projected/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-kube-api-access-hc4pd\") pod \"must-gather-cttbx\" (UID: \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\") " pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:33:43.312579 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.312559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-must-gather-output\") pod \"must-gather-cttbx\" (UID: \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\") " pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:33:43.324256 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.324225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc4pd\" (UniqueName: \"kubernetes.io/projected/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-kube-api-access-hc4pd\") pod \"must-gather-cttbx\" (UID: \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\") " pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:33:43.447853 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.447756 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:33:43.569814 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.569786 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-stfls/must-gather-cttbx"]
Apr 16 18:33:43.572102 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:33:43.572058 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4e792b_e8f5_4efe_b321_bfee9bb793bf.slice/crio-2030ad498334a7fa14a8a57044f65dc2ac8115a6d2ad88baebec15490e57143e WatchSource:0}: Error finding container 2030ad498334a7fa14a8a57044f65dc2ac8115a6d2ad88baebec15490e57143e: Status 404 returned error can't find the container with id 2030ad498334a7fa14a8a57044f65dc2ac8115a6d2ad88baebec15490e57143e
Apr 16 18:33:43.573711 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.573694 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:33:43.857537 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:43.857503 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-stfls/must-gather-cttbx" event={"ID":"bd4e792b-e8f5-4efe-b321-bfee9bb793bf","Type":"ContainerStarted","Data":"2030ad498334a7fa14a8a57044f65dc2ac8115a6d2ad88baebec15490e57143e"}
Apr 16 18:33:47.876092 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:47.876041 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-stfls/must-gather-cttbx" event={"ID":"bd4e792b-e8f5-4efe-b321-bfee9bb793bf","Type":"ContainerStarted","Data":"af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f"}
Apr 16 18:33:48.881065 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:48.881027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-stfls/must-gather-cttbx" event={"ID":"bd4e792b-e8f5-4efe-b321-bfee9bb793bf","Type":"ContainerStarted","Data":"c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd"}
Apr 16 18:33:48.899821 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:48.899770 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-stfls/must-gather-cttbx" podStartSLOduration=1.750672648 podStartE2EDuration="5.899751895s" podCreationTimestamp="2026-04-16 18:33:43 +0000 UTC" firstStartedPulling="2026-04-16 18:33:43.573818347 +0000 UTC m=+1093.662367738" lastFinishedPulling="2026-04-16 18:33:47.722897591 +0000 UTC m=+1097.811446985" observedRunningTime="2026-04-16 18:33:48.89860736 +0000 UTC m=+1098.987156773" watchObservedRunningTime="2026-04-16 18:33:48.899751895 +0000 UTC m=+1098.988301312"
Apr 16 18:33:59.918878 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:59.918839 2573 generic.go:358] "Generic (PLEG): container finished" podID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerID="af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f" exitCode=0
Apr 16 18:33:59.918878 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:59.918872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-stfls/must-gather-cttbx" event={"ID":"bd4e792b-e8f5-4efe-b321-bfee9bb793bf","Type":"ContainerDied","Data":"af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f"}
Apr 16 18:33:59.919352 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:33:59.919228 2573 scope.go:117] "RemoveContainer" containerID="af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f"
Apr 16 18:34:00.812582 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:00.812550 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-stfls_must-gather-cttbx_bd4e792b-e8f5-4efe-b321-bfee9bb793bf/gather/0.log"
Apr 16 18:34:04.147533 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:04.147496 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-k4g48_b343ce6d-77fa-4692-ab8c-61876213aed4/global-pull-secret-syncer/0.log"
Apr 16 18:34:04.268881 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:04.268848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q7ctb_7a47f3a5-f4cc-445c-a5c0-e2a5af6a9ea3/konnectivity-agent/0.log"
Apr 16 18:34:04.355467 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:04.355437 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-125.ec2.internal_50124a769f014399bcd19764e87e11fb/haproxy/0.log"
Apr 16 18:34:06.285944 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.285900 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-stfls/must-gather-cttbx"]
Apr 16 18:34:06.286424 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.286194 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-stfls/must-gather-cttbx" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerName="copy" containerID="cri-o://c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd" gracePeriod=2
Apr 16 18:34:06.287452 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.287431 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-stfls/must-gather-cttbx"]
Apr 16 18:34:06.289013 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.288985 2573 status_manager.go:895] "Failed to get status for pod" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" pod="openshift-must-gather-stfls/must-gather-cttbx" err="pods \"must-gather-cttbx\" is forbidden: User \"system:node:ip-10-0-135-125.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-stfls\": no relationship found between node 'ip-10-0-135-125.ec2.internal' and this object"
Apr 16 18:34:06.516330 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.516307 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-stfls_must-gather-cttbx_bd4e792b-e8f5-4efe-b321-bfee9bb793bf/copy/0.log"
Apr 16 18:34:06.516664 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.516649 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:34:06.605226 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.605193 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc4pd\" (UniqueName: \"kubernetes.io/projected/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-kube-api-access-hc4pd\") pod \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\" (UID: \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\") "
Apr 16 18:34:06.605400 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.605284 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-must-gather-output\") pod \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\" (UID: \"bd4e792b-e8f5-4efe-b321-bfee9bb793bf\") "
Apr 16 18:34:06.607253 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.607225 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bd4e792b-e8f5-4efe-b321-bfee9bb793bf" (UID: "bd4e792b-e8f5-4efe-b321-bfee9bb793bf"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:34:06.607423 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.607399 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-kube-api-access-hc4pd" (OuterVolumeSpecName: "kube-api-access-hc4pd") pod "bd4e792b-e8f5-4efe-b321-bfee9bb793bf" (UID: "bd4e792b-e8f5-4efe-b321-bfee9bb793bf"). InnerVolumeSpecName "kube-api-access-hc4pd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:34:06.706585 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.706549 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-must-gather-output\") on node \"ip-10-0-135-125.ec2.internal\" DevicePath \"\""
Apr 16 18:34:06.706585 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.706579 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hc4pd\" (UniqueName: \"kubernetes.io/projected/bd4e792b-e8f5-4efe-b321-bfee9bb793bf-kube-api-access-hc4pd\") on node \"ip-10-0-135-125.ec2.internal\" DevicePath \"\""
Apr 16 18:34:06.942298 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.942218 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-stfls_must-gather-cttbx_bd4e792b-e8f5-4efe-b321-bfee9bb793bf/copy/0.log"
Apr 16 18:34:06.942577 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.942553 2573 generic.go:358] "Generic (PLEG): container finished" podID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerID="c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd" exitCode=143
Apr 16 18:34:06.942632 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.942610 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-stfls/must-gather-cttbx"
Apr 16 18:34:06.942727 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.942615 2573 scope.go:117] "RemoveContainer" containerID="c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd"
Apr 16 18:34:06.951008 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.950916 2573 scope.go:117] "RemoveContainer" containerID="af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f"
Apr 16 18:34:06.964228 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.964208 2573 scope.go:117] "RemoveContainer" containerID="c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd"
Apr 16 18:34:06.964642 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:34:06.964614 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd\": container with ID starting with c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd not found: ID does not exist" containerID="c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd"
Apr 16 18:34:06.964717 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.964645 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd"} err="failed to get container status \"c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd\": rpc error: code = NotFound desc = could not find container \"c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd\": container with ID starting with c56a111f74f8bfab839d2cb7c55c06a2e7adc0deb1db57de2083c4cff8fee3fd not found: ID does not exist"
Apr 16 18:34:06.964717 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.964665 2573 scope.go:117] "RemoveContainer" containerID="af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f"
Apr 16 18:34:06.964975 ip-10-0-135-125 kubenswrapper[2573]: E0416 18:34:06.964957 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f\": container with ID starting with af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f not found: ID does not exist" containerID="af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f"
Apr 16 18:34:06.965015 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:06.964981 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f"} err="failed to get container status \"af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f\": rpc error: code = NotFound desc = could not find container \"af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f\": container with ID starting with af81343b24089c403152434a350d003f1b65f1322cd56cfce07a48d3e0e0890f not found: ID does not exist"
Apr 16 18:34:08.504397 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:08.504367 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" path="/var/lib/kubelet/pods/bd4e792b-e8f5-4efe-b321-bfee9bb793bf/volumes"
Apr 16 18:34:08.851500 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:08.851465 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-92cxf_e1a285c0-f532-4ac2-821c-860f34c577d3/monitoring-plugin/0.log"
Apr 16 18:34:09.081972 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:09.081937 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7z8sx_edf117dd-4ff2-489b-b752-a7843cb792f3/node-exporter/0.log"
Apr 16 18:34:09.116426 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:09.116354 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7z8sx_edf117dd-4ff2-489b-b752-a7843cb792f3/kube-rbac-proxy/0.log"
Apr 16 18:34:09.155318 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:09.155294 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7z8sx_edf117dd-4ff2-489b-b752-a7843cb792f3/init-textfile/0.log"
Apr 16 18:34:09.566605 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:09.566582 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-bbjsv_5cd34ba5-a09f-4513-a54c-72a781092903/prometheus-operator-admission-webhook/0.log"
Apr 16 18:34:12.452443 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:12.452419 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b69b67b75-fw45q_4e7783e5-2d88-49fe-b273-0c5b49cdae71/console/0.log"
Apr 16 18:34:12.484937 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:12.484911 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-sqmkv_1f2e7243-0f55-4aaf-9db9-f670dd987553/download-server/0.log"
Apr 16 18:34:13.180110 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.180062 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"]
Apr 16 18:34:13.180382 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.180369 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerName="gather"
Apr 16 18:34:13.180429 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.180383 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerName="gather"
Apr 16 18:34:13.180429 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.180420 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerName="copy"
Apr 16 18:34:13.180429 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.180426 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerName="copy"
Apr 16 18:34:13.180530 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.180496 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerName="copy"
Apr 16 18:34:13.180530 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.180504 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd4e792b-e8f5-4efe-b321-bfee9bb793bf" containerName="gather"
Apr 16 18:34:13.185933 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.185911 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.188647 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.188626 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5b6zt\"/\"openshift-service-ca.crt\""
Apr 16 18:34:13.189592 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.189575 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5b6zt\"/\"kube-root-ca.crt\""
Apr 16 18:34:13.189676 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.189575 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5b6zt\"/\"default-dockercfg-nlsgc\""
Apr 16 18:34:13.192872 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.192845 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"]
Apr 16 18:34:13.258684 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.258647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-sys\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.258862 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.258700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-lib-modules\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.258862 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.258802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-proc\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.258862 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.258837 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-podres\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.258957 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.258902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzwtn\" (UniqueName: \"kubernetes.io/projected/6164106a-9e46-436e-a7b1-032247635586-kube-api-access-nzwtn\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.359974 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.359940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-podres\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.359974 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.359985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzwtn\" (UniqueName: \"kubernetes.io/projected/6164106a-9e46-436e-a7b1-032247635586-kube-api-access-nzwtn\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.360264 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.360008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-sys\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.360264 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.360039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-lib-modules\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.360264 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.360070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-proc\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.360264 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.360170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-proc\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.360264 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.360169 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-podres\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.360264 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.360169 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-sys\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.360264 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.360206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6164106a-9e46-436e-a7b1-032247635586-lib-modules\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.368916 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.368885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzwtn\" (UniqueName: \"kubernetes.io/projected/6164106a-9e46-436e-a7b1-032247635586-kube-api-access-nzwtn\") pod \"perf-node-gather-daemonset-22pf2\" (UID: \"6164106a-9e46-436e-a7b1-032247635586\") " pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.496504 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.496405 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.619591 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.619552 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"]
Apr 16 18:34:13.622485 ip-10-0-135-125 kubenswrapper[2573]: W0416 18:34:13.622457 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6164106a_9e46_436e_a7b1_032247635586.slice/crio-92cf0c8e5f79c8665e7f7f9179622702ad1e52ad7d56cdbef8f7f22c71b8ee8d WatchSource:0}: Error finding container 92cf0c8e5f79c8665e7f7f9179622702ad1e52ad7d56cdbef8f7f22c71b8ee8d: Status 404 returned error can't find the container with id 92cf0c8e5f79c8665e7f7f9179622702ad1e52ad7d56cdbef8f7f22c71b8ee8d
Apr 16 18:34:13.905923 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.905892 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xbf7w_d9480c7a-7ea2-4833-a0ed-10e03e4bc66d/dns/0.log"
Apr 16 18:34:13.936095 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.936053 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xbf7w_d9480c7a-7ea2-4833-a0ed-10e03e4bc66d/kube-rbac-proxy/0.log"
Apr 16 18:34:13.965928 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.965904 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-82pct_9125009d-485e-443b-85bb-384f2afc6de2/dns-node-resolver/0.log"
Apr 16 18:34:13.966194 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.966176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2" event={"ID":"6164106a-9e46-436e-a7b1-032247635586","Type":"ContainerStarted","Data":"2383c022b29e495cf5b0058852a060a996d8124759b9cdd4f8fd6bbec920741f"}
Apr 16 18:34:13.966296 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.966205 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2" event={"ID":"6164106a-9e46-436e-a7b1-032247635586","Type":"ContainerStarted","Data":"92cf0c8e5f79c8665e7f7f9179622702ad1e52ad7d56cdbef8f7f22c71b8ee8d"}
Apr 16 18:34:13.966357 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.966345 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:13.983903 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:13.983857 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2" podStartSLOduration=0.983844545 podStartE2EDuration="983.844545ms" podCreationTimestamp="2026-04-16 18:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:13.982190696 +0000 UTC m=+1124.070740108" watchObservedRunningTime="2026-04-16 18:34:13.983844545 +0000 UTC m=+1124.072393957"
Apr 16 18:34:14.502982 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:14.502944 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5cfdd9f895-79lv8_26a76190-c321-4d84-ab5e-db9513250bed/registry/0.log"
Apr 16 18:34:14.528210 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:14.528184 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8694n_44520956-97f4-441e-acae-e1c5b82de2ea/node-ca/0.log"
Apr 16 18:34:15.970684 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:15.970652 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4rlf8_61e9f3e6-a9f4-46cd-b1c0-cbdc48158979/serve-healthcheck-canary/0.log"
Apr 16 18:34:16.584845 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:16.584817 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mkfg8_a4e48994-8aa5-499b-9d41-ca0fd8b95a04/kube-rbac-proxy/0.log"
Apr 16 18:34:16.615543 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:16.615517 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mkfg8_a4e48994-8aa5-499b-9d41-ca0fd8b95a04/exporter/0.log"
Apr 16 18:34:16.644289 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:16.644266 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mkfg8_a4e48994-8aa5-499b-9d41-ca0fd8b95a04/extractor/0.log"
Apr 16 18:34:18.955968 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:18.955938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5d577d5585-5j685_95c65514-2913-4cdc-b1a1-bcda73619d7c/manager/0.log"
Apr 16 18:34:19.978893 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:19.978864 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5b6zt/perf-node-gather-daemonset-22pf2"
Apr 16 18:34:23.069586 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:23.069561 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-zp6tx_4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad/migrator/0.log"
Apr 16 18:34:23.113755 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:23.113731 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-zp6tx_4d0a9f4e-bdd1-416b-8dc9-28df41dd54ad/graceful-termination/0.log"
Apr 16 18:34:23.588553 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:23.588520 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-crpf7_fd04a1c3-675f-42ec-b892-a37ac5e7f02c/kube-storage-version-migrator-operator/1.log"
Apr 16 18:34:23.589478 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:23.589461 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-crpf7_fd04a1c3-675f-42ec-b892-a37ac5e7f02c/kube-storage-version-migrator-operator/0.log"
Apr 16 18:34:25.125834 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.125802 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xflf7_533ef48a-edc5-4453-925d-1c6e4b8c3aa0/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:34:25.155503 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.155474 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xflf7_533ef48a-edc5-4453-925d-1c6e4b8c3aa0/egress-router-binary-copy/0.log"
Apr 16 18:34:25.183036 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.183000 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xflf7_533ef48a-edc5-4453-925d-1c6e4b8c3aa0/cni-plugins/0.log"
Apr 16 18:34:25.210252 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.210223 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xflf7_533ef48a-edc5-4453-925d-1c6e4b8c3aa0/bond-cni-plugin/0.log"
Apr 16 18:34:25.238415 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.238389 2573 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xflf7_533ef48a-edc5-4453-925d-1c6e4b8c3aa0/routeoverride-cni/0.log" Apr 16 18:34:25.266416 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.266389 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xflf7_533ef48a-edc5-4453-925d-1c6e4b8c3aa0/whereabouts-cni-bincopy/0.log" Apr 16 18:34:25.293974 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.293950 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xflf7_533ef48a-edc5-4453-925d-1c6e4b8c3aa0/whereabouts-cni/0.log" Apr 16 18:34:25.327731 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.327700 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dp48w_99b3e81e-b20b-4a7d-a324-a26ce7f61f58/kube-multus/0.log" Apr 16 18:34:25.362636 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.362609 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7rqfn_1c559edc-905d-4a66-b3b6-e4767670d083/network-metrics-daemon/0.log" Apr 16 18:34:25.385922 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:25.385848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7rqfn_1c559edc-905d-4a66-b3b6-e4767670d083/kube-rbac-proxy/0.log" Apr 16 18:34:26.361477 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:26.361447 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-controller/0.log" Apr 16 18:34:26.386306 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:26.386279 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/0.log" Apr 16 18:34:26.390937 ip-10-0-135-125 kubenswrapper[2573]: I0416 
18:34:26.390911 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovn-acl-logging/1.log" Apr 16 18:34:26.416237 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:26.416213 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/kube-rbac-proxy-node/0.log" Apr 16 18:34:26.451543 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:26.451517 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:34:26.477700 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:26.477659 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/northd/0.log" Apr 16 18:34:26.504292 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:26.504269 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/nbdb/0.log" Apr 16 18:34:26.535175 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:26.535149 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/sbdb/0.log" Apr 16 18:34:26.642171 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:26.642100 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-25kmv_c5641439-7e2a-42bc-ae08-1777c6dcb692/ovnkube-controller/0.log" Apr 16 18:34:28.443613 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:28.443565 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-jtdhp_ae307bf6-3540-4f0e-a6ff-9d14408fe2bb/check-endpoints/0.log" Apr 16 18:34:28.470740 ip-10-0-135-125 
kubenswrapper[2573]: I0416 18:34:28.470714 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2xsjc_0e9adccb-918e-48c1-97ed-0c0d8728a3f4/network-check-target-container/0.log" Apr 16 18:34:29.687094 ip-10-0-135-125 kubenswrapper[2573]: I0416 18:34:29.687053 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-cbp5h_6d0dfe5e-9489-4a73-ae93-e266ba6c0e34/iptables-alerter/0.log"