Apr 24 22:29:51.188430 ip-10-0-133-9 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:29:51.635985 ip-10-0-133-9 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:51.635985 ip-10-0-133-9 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:29:51.635985 ip-10-0-133-9 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:51.635985 ip-10-0-133-9 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:29:51.635985 ip-10-0-133-9 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
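The deprecation warnings above all point at the file named by the kubelet's --config flag. A minimal sketch of how those flagged parameters could be carried in a KubeletConfiguration instead (the field values below are illustrative assumptions, not read from this node):

```yaml
# KubeletConfiguration sketch — hypothetical values, shown only to
# illustrate the config-file equivalents of the deprecated flags above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 200Mi
```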
Apr 24 22:29:51.639143 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.639067 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:29:51.642561 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642546 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:51.642561 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642561 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642568 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642572 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642575 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642577 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642580 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642583 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642586 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642589 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642592 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642597 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642601 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642604 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642607 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642610 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642612 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642615 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642618 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642620 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:51.642626 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642623 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642625 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642628 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642631 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642634 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642637 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642640 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642643 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642645 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642648 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642651 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642653 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642656 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642658 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642661 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642663 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642665 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642668 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642670 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642673 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:51.643077 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642675 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642678 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642680 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642682 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642685 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642690 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642693 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642695 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642698 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642700 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642703 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642706 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642708 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642711 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642713 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642716 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642718 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642721 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642725 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:51.643615 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642728 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642731 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642734 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642736 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642739 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642742 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642745 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642747 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642750 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642752 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642755 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642757 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642760 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642763 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642765 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642768 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642770 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642773 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642776 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642779 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:51.644086 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642782 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642785 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642787 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642790 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642793 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642795 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.642798 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643188 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643193 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643196 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643199 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643201 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643206 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643209 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643211 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643214 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643224 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643228 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643231 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:51.644584 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643234 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643237 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643240 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643243 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643245 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643248 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643252 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643255 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643258 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643261 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643264 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643267 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643270 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643273 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643276 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643278 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643282 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643285 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643287 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643290 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:51.645032 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643292 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643295 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643297 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643301 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643303 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643306 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643309 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643311 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643314 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643316 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643319 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643321 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643324 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643327 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643329 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643332 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643334 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643336 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643339 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643341 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:51.645538 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643344 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643347 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643349 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643352 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643355 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643357 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643359 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643362 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643365 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643367 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643370 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643372 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643376 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643379 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643381 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643384 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643386 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643389 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643392 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643394 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:51.646070 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643397 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643399 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643402 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643404 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643407 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643409 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643412 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643414 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643417 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643419 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643422 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643424 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643427 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.643429 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643932 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643941 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643951 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643956 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643960 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643964 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643968 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:29:51.646577 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643973 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643977 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643980 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643984 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643987 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643990 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643994 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643996 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.643999 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644002 2571 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644006 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644010 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644014 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644018 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644021 2571 flags.go:64] FLAG: --config-dir=""
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644024 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644027 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644031 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644034 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644038 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644040 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644044 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644047 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644050 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644053 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:29:51.647164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644056 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644060 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644064 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644067 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644070 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644073 2571 flags.go:64] FLAG: --enable-server="true"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644076 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644080 2571 flags.go:64] FLAG: --event-burst="100"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644083 2571 flags.go:64] FLAG: --event-qps="50"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644086 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644090 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644093 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644096 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644099 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644102 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644105 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644108 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644111 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644115 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644118 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644121 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644124 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644127
2571 flags.go:64] FLAG: --feature-gates="" Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644130 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644134 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:29:51.647777 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644137 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644140 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644143 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644146 2571 flags.go:64] FLAG: --help="false" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644149 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-133-9.ec2.internal" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644165 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644168 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644171 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644175 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644179 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644181 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:29:51.644184 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644187 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644190 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644193 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644196 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644199 2571 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644202 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644205 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644208 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644211 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644214 2571 flags.go:64] FLAG: --lock-file="" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644216 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644220 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:29:51.648424 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644223 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644229 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:29:51.648992 ip-10-0-133-9 
kubenswrapper[2571]: I0424 22:29:51.644232 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644242 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644245 2571 flags.go:64] FLAG: --logging-format="text" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644248 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644252 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644255 2571 flags.go:64] FLAG: --manifest-url="" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644258 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644268 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644271 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644275 2571 flags.go:64] FLAG: --max-pods="110" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644288 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644291 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644294 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644297 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644300 2571 flags.go:64] FLAG: 
--minimum-image-ttl-duration="2m0s" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644303 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644306 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644313 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644316 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644320 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644323 2571 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:29:51.648992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644326 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644331 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644334 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644337 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644341 2571 flags.go:64] FLAG: --port="10250" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644344 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644347 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c8c1e648c70e6456" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644350 2571 
flags.go:64] FLAG: --qos-reserved="" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644353 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644356 2571 flags.go:64] FLAG: --register-node="true" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644359 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644362 2571 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644366 2571 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644369 2571 flags.go:64] FLAG: --registry-qps="5" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644372 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644375 2571 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644379 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644383 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644386 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644389 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644392 2571 flags.go:64] FLAG: --runonce="false" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644395 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644398 2571 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644401 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644404 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644407 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:29:51.649595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644410 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644413 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644416 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644419 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644422 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644425 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644427 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644430 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644433 2571 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644436 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644441 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:29:51.650358 ip-10-0-133-9 
kubenswrapper[2571]: I0424 22:29:51.644444 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644446 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644451 2571 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644454 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644459 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644462 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644465 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644468 2571 flags.go:64] FLAG: --v="2" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644472 2571 flags.go:64] FLAG: --version="false" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644477 2571 flags.go:64] FLAG: --vmodule="" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644481 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644484 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644581 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:51.650358 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644585 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644588 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644591 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644594 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644597 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644599 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644602 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644604 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644607 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644609 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644612 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644614 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644617 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644619 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644622 2571 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644624 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644627 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644629 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644632 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644634 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:51.651303 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644637 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644639 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644642 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644645 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644648 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644650 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644653 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644656 2571 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644658 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644662 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644664 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644667 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644670 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644672 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644675 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644677 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644680 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644683 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644685 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:51.651871 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644688 2571 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesAzure Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644692 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644695 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644698 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644700 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644703 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644705 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644708 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644710 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644713 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644715 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644718 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644720 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 
22:29:51.644723 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644725 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644729 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644733 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644737 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644739 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:51.652364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644745 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644748 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644751 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644754 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644757 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644760 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644762 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 
22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644765 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644768 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644770 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644773 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644775 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644778 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644780 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644783 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644786 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644788 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644791 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644793 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644795 2571 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 24 22:29:51.652824 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644798 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644801 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644803 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644806 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644808 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644811 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.644813 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.644821 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.652242 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.652258 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" 
GOTRACEBACK=""
Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652317 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652322 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652325 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652330 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652334 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:51.653328 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652338 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652341 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652344 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652346 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652349 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652351 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652354 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652357 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652359 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652362 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652365 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652367 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652370 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652373 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652375 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652378 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652381 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652383 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652386 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652388 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:51.653708 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652391 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652393 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652396 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652399 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652402 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652405 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652409 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652411 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652414 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652416 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652419 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652422 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652425 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652428 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652430 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652433 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652435 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652438 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652441 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652443 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:51.654276 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652446 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652448 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652452 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652454 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652457 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652459 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652462 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652464 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652467 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652469 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652472 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652474 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652477 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652479 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652482 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652484 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652487 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652491 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652497 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652500 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:51.654761 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652503 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652506 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652509 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652512 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652515 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652518 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652520 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652523 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652525 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652528 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652530 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652534 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652536 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652539 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652542 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652544 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652547 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652550 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652552 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652555 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:51.655266 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652557 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.652562 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652656 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652661 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652664 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652667 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652670 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652672 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652675 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652677 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652680 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652683 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652686 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652688 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652691 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:51.655784 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652694 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652698 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652701 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652704 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652707 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652709 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652712 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652714 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652717 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652721 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652723 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652726 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652729 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652732 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652735 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652737 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652740 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652742 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652745 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652747 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:51.656169 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652750 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652752 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652755 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652757 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652760 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652763 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652765 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652768 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652771 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652773 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652776 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652778 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652780 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652783 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652785 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652788 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652791 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652793 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652797 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:51.656681 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652801 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652804 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652806 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652810 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652813 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652815 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652818 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652821 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652823 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652826 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652828 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652831 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652834 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652837 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652840 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652842 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652844 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652847 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652849 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:51.657148 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652852 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652855 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652858 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652861 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652863 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652866 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652868 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652871 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652874 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652876 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652879 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652881 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652884 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652887 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:51.652889 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.652894 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:51.657634 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.653537 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 22:29:51.658020 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.655562 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 22:29:51.658020 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.656585 2571 server.go:1019] "Starting client certificate rotation"
Apr 24 22:29:51.658020 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.656682 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:51.658020 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.656722 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:51.680697 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.680680 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:51.683181 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.683147 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:51.701257 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.701241 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 24 22:29:51.708406 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.708387 2571 log.go:25] "Validated CRI v1 image API"
Apr 24 22:29:51.710332 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.710314 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 22:29:51.713584 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.713564 2571 fs.go:135] Filesystem UUIDs: map[692443c6-6256-4a00-af9a-cecd29f9d5f5:/dev/nvme0n1p4 6e691609-d3b2-4480-8066-01fd4aa10b87:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 24 22:29:51.713662 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.713586 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 22:29:51.713819 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.713804 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:51.719515 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.719414 2571 manager.go:217] Machine: {Timestamp:2026-04-24 22:29:51.717548127 +0000 UTC m=+0.414170967 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102351 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec246a2bb68e2b219f94fb2761780c6d SystemUUID:ec246a2b-b68e-2b21-9f94-fb2761780c6d BootID:72b20cda-5b8d-4862-beae-eca6808c1669 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:16:7b:f1:3b:15 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:16:7b:f1:3b:15 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:aa:67:5d:11:a8:9d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 22:29:51.719515 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.719509 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 22:29:51.719630 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.719580 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 22:29:51.722463 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.722436 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 22:29:51.722595 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.722465 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-9.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 22:29:51.722644 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.722604 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 22:29:51.722644 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.722613 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 22:29:51.722644 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.722629 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:51.723444 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.723433 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:51.724825 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.724815 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:51.724931 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.724923 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 22:29:51.728293 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.728283 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 22:29:51.728344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.728302 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 22:29:51.728344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.728318 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 22:29:51.728344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.728327 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 24 22:29:51.728344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.728335 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 22:29:51.729269 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.729255 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 22:29:51.729348 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.729272 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 22:29:51.732217 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.732202 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 22:29:51.733705 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.733686 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 22:29:51.735394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735383 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 22:29:51.735446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735399 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 22:29:51.735446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735405 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 22:29:51.735446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735411 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 22:29:51.735446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735417 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 22:29:51.735446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735422 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 22:29:51.735446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735427 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 22:29:51.735446 ip-10-0-133-9
kubenswrapper[2571]: I0424 22:29:51.735433 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:29:51.735446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735441 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:29:51.735446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735447 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:29:51.735783 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735455 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:29:51.735783 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.735463 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:29:51.736402 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.736390 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:29:51.736402 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.736400 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:29:51.739864 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.739850 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:29:51.739926 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.739885 2571 server.go:1295] "Started kubelet" Apr 24 22:29:51.740018 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.739973 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 22:29:51.740071 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.739978 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 22:29:51.740071 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.740067 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 22:29:51.740690 ip-10-0-133-9 systemd[1]: Started Kubernetes Kubelet. 
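The nodeConfig entry above embeds the kubelet's hard eviction thresholds as a JSON blob, which is hard to read inline. A minimal sketch of pulling them out for triage — the `parse_eviction_thresholds` helper and the trimmed sample are illustrative only, not kubelet code:

```python
import json

def parse_eviction_thresholds(node_config_json: str) -> list[str]:
    """Render kubelet HardEvictionThresholds as 'signal < limit' strings.

    Quantity-based thresholds (e.g. "100Mi") are printed as-is; percentage-based
    ones (Quantity is null) are formatted as a percent, matching the log's shape.
    """
    cfg = json.loads(node_config_json)
    out = []
    for t in cfg.get("HardEvictionThresholds", []):
        v = t["Value"]
        limit = v["Quantity"] if v["Quantity"] is not None else f'{v["Percentage"]:.0%}'
        out.append(f'{t["Signal"]} < {limit}')
    return out

# Trimmed-down fragment matching the structure logged by container_manager_linux.go:275
sample = (
    '{"HardEvictionThresholds":['
    '{"Signal":"memory.available","Operator":"LessThan",'
    '"Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},'
    '{"Signal":"nodefs.available","Operator":"LessThan",'
    '"Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}]}'
)

print(parse_eviction_thresholds(sample))
# → ['memory.available < 100Mi', 'nodefs.available < 10%']
```

These thresholds are what the eviction manager enforces once its control loop starts later in the log.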
Apr 24 22:29:51.741316 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.741301 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 22:29:51.742005 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.741986 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 22:29:51.744417 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.744389 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-9.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 22:29:51.744571 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.744550 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-9.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 22:29:51.745524 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.744738 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 22:29:51.747922 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.747898 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:51.748511 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.748494 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 22:29:51.749277 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749247 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 22:29:51.749368 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749311 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 22:29:51.749368 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749316 2571 factory.go:55] Registering systemd factory
Apr 24 22:29:51.749368 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749339 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 24 22:29:51.749619 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749323 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 22:29:51.749763 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749712 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 22:29:51.749763 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749722 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 22:29:51.749857 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749806 2571 factory.go:153] Registering CRI-O factory
Apr 24 22:29:51.749857 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749818 2571 factory.go:223] Registration of the crio container factory successfully
Apr 24 22:29:51.749951 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749870 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 22:29:51.749951 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749896 2571 factory.go:103] Registering Raw factory
Apr 24 22:29:51.749951 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.749914 2571 manager.go:1196] Started watching for new ooms in manager
Apr 24 22:29:51.750476 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.750451 2571 manager.go:319] Starting recovery of all containers
Apr 24 22:29:51.750552 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.750519 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:51.752470 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.747544 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-9.ec2.internal.18a96b8f6dbb6176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-9.ec2.internal,UID:ip-10-0-133-9.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-9.ec2.internal,},FirstTimestamp:2026-04-24 22:29:51.739863414 +0000 UTC m=+0.436486255,LastTimestamp:2026-04-24 22:29:51.739863414 +0000 UTC m=+0.436486255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-9.ec2.internal,}"
Apr 24 22:29:51.753309 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.753280 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 22:29:51.753409 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.753386 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-9.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 22:29:51.754230 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.754187 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 22:29:51.761680 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.761665 2571 manager.go:324] Recovery completed
Apr 24 22:29:51.766193 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.766181 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:51.768763 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.768746 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:51.768821 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.768776 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:51.768821 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.768788 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:51.769265 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.769248 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 22:29:51.769265 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.769264 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 22:29:51.769373 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.769281 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:51.770352 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.770294 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-9.ec2.internal.18a96b8f6f7459fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-9.ec2.internal,UID:ip-10-0-133-9.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-9.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-9.ec2.internal,},FirstTimestamp:2026-04-24 22:29:51.768762877 +0000 UTC m=+0.465385718,LastTimestamp:2026-04-24 22:29:51.768762877 +0000 UTC m=+0.465385718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-9.ec2.internal,}"
Apr 24 22:29:51.771601 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.771582 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4ctb4"
Apr 24 22:29:51.771694 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.771621 2571 policy_none.go:49] "None policy: Start"
Apr 24 22:29:51.771694 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.771636 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 22:29:51.771694 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.771646 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 22:29:51.779336 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.779320 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4ctb4"
Apr 24 22:29:51.782360 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.782299 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-9.ec2.internal.18a96b8f6f74a47e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-9.ec2.internal,UID:ip-10-0-133-9.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-133-9.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-133-9.ec2.internal,},FirstTimestamp:2026-04-24 22:29:51.76878195 +0000 UTC m=+0.465404790,LastTimestamp:2026-04-24 22:29:51.76878195 +0000 UTC m=+0.465404790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-9.ec2.internal,}"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.809458 2571 manager.go:341] "Starting Device Plugin manager"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.809495 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.809505 2571 server.go:85] "Starting device plugin registration server"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.809774 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.809787 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.809934 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.810003 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.810009 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.810975 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 22:29:51.816467 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.811011 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:51.846917 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.846896 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 22:29:51.848226 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.848211 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 22:29:51.848307 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.848242 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 22:29:51.848307 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.848259 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
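The entries above mix klog severities (I/W/E) and carry the interesting detail in an `err="..."` field — e.g. the repeated `system:anonymous` RBAC rejections during bootstrap. A small triage sketch for lines in this format; the `triage` helper and its regexes are illustrative, not part of any kubelet tooling:

```python
import re

# klog header: severity letter + MMDD, then a HH:MM:SS.micros timestamp,
# e.g. 'E0424 22:29:51.750519'
KLOG = re.compile(r'\b([IWE])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)')
# err="..." field; the value may contain backslash-escaped quotes
ERR = re.compile(r'err="((?:[^"\\]|\\.)*)"')

def triage(line: str) -> dict:
    """Classify one kubelet journal line by klog severity and extract err=..."""
    sev = KLOG.search(line)
    err = ERR.search(line)
    return {
        "severity": sev.group(1) if sev else None,
        "err": err.group(1) if err else None,
    }

line = ('Apr 24 22:29:51.750552 ip-10-0-133-9 kubenswrapper[2571]: '
        'E0424 22:29:51.750519 2571 kubelet_node_status.go:515] '
        '"Error getting the current node from lister" '
        'err="node \\"ip-10-0-133-9.ec2.internal\\" not found"')
print(triage(line))
```

Filtering a boot log down to `severity == "E"` lines like this makes the transient bootstrap errors (anonymous-user RBAC denials, node-not-found retries) easy to separate from the steady-state INFO chatter.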
Apr 24 22:29:51.848307 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.848265 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 22:29:51.848307 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.848302 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 22:29:51.851252 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.851234 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:51.911609 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.911571 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:51.912523 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.912503 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:51.912587 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.912540 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:51.912587 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.912553 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:51.912587 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.912574 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-9.ec2.internal"
Apr 24 22:29:51.925979 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.925961 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-9.ec2.internal"
Apr 24 22:29:51.926045 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.925981 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-9.ec2.internal\": node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:51.948399 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.948378 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal"]
Apr 24 22:29:51.948468 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.948434 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:51.949648 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.949635 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:51.949712 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.949659 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:51.949712 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.949668 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:51.950766 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.950749 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d104bb52543cda5d9dce608c30af4ba4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal\" (UID: \"d104bb52543cda5d9dce608c30af4ba4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:51.950810 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.950779 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d104bb52543cda5d9dce608c30af4ba4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal\" (UID: \"d104bb52543cda5d9dce608c30af4ba4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:51.950879 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.950868 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:51.951014 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.951000 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:51.951053 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.951029 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:51.951541 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.951520 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:51.951541 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.951541 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:51.951638 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.951554 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:51.951638 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.951626 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:51.951709 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.951642 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:51.951709 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.951656 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:51.952077 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.952061 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:51.952737 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.952720 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:51.952782 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.952751 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:51.953364 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.953349 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:51.953445 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.953373 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:51.953445 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:51.953387 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:51.968665 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.968642 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-9.ec2.internal\" not found" node="ip-10-0-133-9.ec2.internal"
Apr 24 22:29:51.972239 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:51.972222 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-9.ec2.internal\" not found" node="ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.051053 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.051036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d104bb52543cda5d9dce608c30af4ba4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal\" (UID: \"d104bb52543cda5d9dce608c30af4ba4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.051129 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.051062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d104bb52543cda5d9dce608c30af4ba4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal\" (UID: \"d104bb52543cda5d9dce608c30af4ba4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.051129 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.051081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/05a6b818b62eb67d03d54038b125e714-config\") pod \"kube-apiserver-proxy-ip-10-0-133-9.ec2.internal\" (UID: \"05a6b818b62eb67d03d54038b125e714\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.051213 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.051129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d104bb52543cda5d9dce608c30af4ba4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal\" (UID: \"d104bb52543cda5d9dce608c30af4ba4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.051213 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.051135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d104bb52543cda5d9dce608c30af4ba4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal\" (UID: \"d104bb52543cda5d9dce608c30af4ba4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.053076 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.053065 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:52.151239 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.151220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/05a6b818b62eb67d03d54038b125e714-config\") pod \"kube-apiserver-proxy-ip-10-0-133-9.ec2.internal\" (UID: \"05a6b818b62eb67d03d54038b125e714\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.151317 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.151260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/05a6b818b62eb67d03d54038b125e714-config\") pod \"kube-apiserver-proxy-ip-10-0-133-9.ec2.internal\" (UID: \"05a6b818b62eb67d03d54038b125e714\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.153688 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.153673 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:52.254807 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.254753 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:52.270919 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.270902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.274430 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.274414 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal"
Apr 24 22:29:52.355439 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.355415 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:52.455883 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.455859 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:52.556466 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.556414 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:52.657007 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.656981 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-9.ec2.internal\" not found"
Apr 24 22:29:52.657007 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.656995 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:29:52.657556 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.657129 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:52.714190 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.714170 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:52.717310 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:52.717285 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05a6b818b62eb67d03d54038b125e714.slice/crio-b61bc8d4863837867bf883f7e76f56652f1d7707777ef57c29c8008844123848 WatchSource:0}: Error finding container b61bc8d4863837867bf883f7e76f56652f1d7707777ef57c29c8008844123848: Status 404 returned error can't find the container with id b61bc8d4863837867bf883f7e76f56652f1d7707777ef57c29c8008844123848
Apr 24 22:29:52.717635 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:52.717617 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd104bb52543cda5d9dce608c30af4ba4.slice/crio-f0f02bad786a618d8b3317ba7f491c83d44b054281bd92fca23b83432d87775d WatchSource:0}: Error finding container f0f02bad786a618d8b3317ba7f491c83d44b054281bd92fca23b83432d87775d: Status 404 returned error can't find the container with id f0f02bad786a618d8b3317ba7f491c83d44b054281bd92fca23b83432d87775d
Apr 24 22:29:52.721943 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.721930 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:29:52.729135 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.729122 2571 apiserver.go:52] "Watching apiserver"
Apr 24 22:29:52.736721 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.736704 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 22:29:52.739538 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.739518 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-dwv7l","openshift-ovn-kubernetes/ovnkube-node-46t57","openshift-cluster-node-tuning-operator/tuned-stvmb","openshift-dns/node-resolver-mjsxr","openshift-multus/multus-additional-cni-plugins-n6k9b","kube-system/konnectivity-agent-2zcz2","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4","openshift-image-registry/node-ca-bgw9g","openshift-multus/multus-76v98","openshift-multus/network-metrics-daemon-tphln","openshift-network-diagnostics/network-check-target-ngpww"]
Apr 24 22:29:52.742443 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.742417 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dwv7l"
Apr 24 22:29:52.742631 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.742606 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bgw9g"
Apr 24 22:29:52.743682 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.743663 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46t57"
Apr 24 22:29:52.744708 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.744694 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-stvmb"
Apr 24 22:29:52.745753 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.745738 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mjsxr"
Apr 24 22:29:52.746983 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.746965 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.747982 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.747968 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 22:29:52.748075 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.748061 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2zcz2" Apr 24 22:29:52.748844 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.748833 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal" Apr 24 22:29:52.749429 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.749411 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.751174 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.750891 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-76v98" Apr 24 22:29:52.753831 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.753813 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:52.753915 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.753877 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:29:52.753991 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.753973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754048 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754034 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovnkube-config\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754105 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754063 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysconfig\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.754105 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-var-lib-kubelet\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.754200 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754110 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-netns\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754200 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-hostroot\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754200 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754190 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdr5\" (UniqueName: \"kubernetes.io/projected/31c0602b-6394-42d8-b7cc-1a807f7ea065-kube-api-access-6tdr5\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.754294 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754217 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-run-netns\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754294 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754236 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-node-log\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754294 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754250 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-env-overrides\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754294 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754263 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-modprobe-d\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.754294 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754277 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-lib-modules\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.754294 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754291 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkh97\" (UniqueName: \"kubernetes.io/projected/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-kube-api-access-gkh97\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754473 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754306 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80bdebdd-794b-491a-b6b9-8ac831319fea-host-slash\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.754473 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-log-socket\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754473 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysctl-d\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.754473 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-k8s-cni-cncf-io\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754473 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/80bdebdd-794b-491a-b6b9-8ac831319fea-iptables-alerter-script\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.754473 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-device-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" 
(UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.754473 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62zrl\" (UniqueName: \"kubernetes.io/projected/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-kube-api-access-62zrl\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754473 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/63b43425-e238-4a1e-a63c-4872ab241776-agent-certs\") pod \"konnectivity-agent-2zcz2\" (UID: \"63b43425-e238-4a1e-a63c-4872ab241776\") " pod="kube-system/konnectivity-agent-2zcz2" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754491 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-daemon-config\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754505 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qthn\" (UniqueName: \"kubernetes.io/projected/664d5264-1f8a-4986-9272-2e8a718a8923-kube-api-access-6qthn\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-cni-binary-copy\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754576 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-systemd\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-cni-netd\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovnkube-script-lib\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754704 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-system-cni-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-socket-dir-parent\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754704 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754703 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/664d5264-1f8a-4986-9272-2e8a718a8923-hosts-file\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-registration-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754744 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-kubelet\") pod \"ovnkube-node-46t57\" (UID: 
\"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754770 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-slash\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-var-lib-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-host\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-cni-bin\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/664d5264-1f8a-4986-9272-2e8a718a8923-tmp-dir\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-system-cni-dir\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754873 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-os-release\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2v2f\" (UniqueName: \"kubernetes.io/projected/04b9f693-ce4b-4c09-b1df-b3f3382c693c-kube-api-access-r2v2f\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754900 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-systemd\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-multus-certs\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-cnibin\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.754992 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.754979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-ovn\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755015 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovn-node-metrics-cert\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755023 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-cnibin\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.755072 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-cni-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755105 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755122 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-run-ovn-kubernetes\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-sys\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755181 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-os-release\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755196 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-cni-multus\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-kubelet\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755225 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-conf-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-sys-fs\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755271 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-etc-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-kubernetes\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755324 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-tuned\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.755524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755339 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-cni-binary-copy\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755352 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-etc-kubernetes\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.756072 ip-10-0-133-9 
kubenswrapper[2571]: I0424 22:29:52.755376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e118d567-f878-439c-b74f-3e060f10ac46-serviceca\") pod \"node-ca-bgw9g\" (UID: \"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755393 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-etc-selinux\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755407 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-run\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c09ee36-d808-4615-bfd1-9a6a361f3a56-tmp\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd8kw\" (UniqueName: \"kubernetes.io/projected/80bdebdd-794b-491a-b6b9-8ac831319fea-kube-api-access-dd8kw\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " 
pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755473 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klsss\" (UniqueName: \"kubernetes.io/projected/e118d567-f878-439c-b74f-3e060f10ac46-kube-api-access-klsss\") pod \"node-ca-bgw9g\" (UID: \"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755504 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-systemd-units\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755528 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/63b43425-e238-4a1e-a63c-4872ab241776-konnectivity-ca\") pod \"konnectivity-agent-2zcz2\" (UID: \"63b43425-e238-4a1e-a63c-4872ab241776\") " pod="kube-system/konnectivity-agent-2zcz2" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755560 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-cni-bin\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysctl-conf\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755596 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwkj9\" (UniqueName: \"kubernetes.io/projected/2c09ee36-d808-4615-bfd1-9a6a361f3a56-kube-api-access-fwkj9\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e118d567-f878-439c-b74f-3e060f10ac46-host\") pod \"node-ca-bgw9g\" (UID: \"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755673 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.756072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.755726 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-socket-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.757639 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.757607 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 22:29:52.757639 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.757610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 22:29:52.757798 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.757610 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.757798 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.757612 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.758023 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.758008 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:29:52.758284 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.758269 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.758489 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.758474 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:52.758545 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.758504 2571 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 22:29:52.758545 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.758518 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.758639 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.758610 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 22:29:52.758689 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.758660 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 22:29:52.758740 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.758695 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dzwh4\"" Apr 24 22:29:52.759029 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759007 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ppgwm\"" Apr 24 22:29:52.759109 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759030 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 22:29:52.759109 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759008 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kfp54\"" Apr 24 22:29:52.759234 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759010 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-87hfw\"" Apr 24 22:29:52.759234 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759130 2571 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:29:52.759316 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759267 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:29:52.759353 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759334 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.759401 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759392 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zdncl\"" Apr 24 22:29:52.759451 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759406 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 22:29:52.759497 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759465 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 22:29:52.759497 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759482 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 22:29:52.759590 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759532 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.759590 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759557 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:29:52.759679 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759595 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 
24 22:29:52.759679 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759612 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n8x2b\"" Apr 24 22:29:52.759679 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759657 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-js2l7\"" Apr 24 22:29:52.759819 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759762 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-44l7b\"" Apr 24 22:29:52.759819 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759805 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 22:29:52.759956 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.759941 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:29:52.760060 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.760043 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-f8tbk\"" Apr 24 22:29:52.760183 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.760048 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 22:29:52.760183 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.760085 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 22:29:52.763524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.763506 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 22:29:52.763757 ip-10-0-133-9 
kubenswrapper[2571]: I0424 22:29:52.763735 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:52.767766 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.767747 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:29:52.767869 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.767812 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal" Apr 24 22:29:52.768660 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.768643 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal"] Apr 24 22:29:52.770357 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.770336 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:29:52.781828 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.781804 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:24:51 +0000 UTC" deadline="2027-12-13 08:27:22.136790977 +0000 UTC" Apr 24 22:29:52.781828 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.781828 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14337h57m29.354966842s" Apr 24 22:29:52.786202 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.786188 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:29:52.786281 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:29:52.786273 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal"] Apr 24 22:29:52.795425 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.795408 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7sxwd" Apr 24 22:29:52.801759 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.801744 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7sxwd" Apr 24 22:29:52.850274 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.850224 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 22:29:52.851164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.851119 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal" event={"ID":"d104bb52543cda5d9dce608c30af4ba4","Type":"ContainerStarted","Data":"f0f02bad786a618d8b3317ba7f491c83d44b054281bd92fca23b83432d87775d"} Apr 24 22:29:52.851952 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.851930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal" event={"ID":"05a6b818b62eb67d03d54038b125e714","Type":"ContainerStarted","Data":"b61bc8d4863837867bf883f7e76f56652f1d7707777ef57c29c8008844123848"} Apr 24 22:29:52.856143 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:52.856236 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856175 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-cnibin\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.856236 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856215 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-ovn\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.856236 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-cnibin\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.856394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovn-node-metrics-cert\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.856394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856269 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-cnibin\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856305 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwvgk\" (UniqueName: \"kubernetes.io/projected/12c9d9cf-479c-46fd-9333-94213f4ff2f0-kube-api-access-zwvgk\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:52.856394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-ovn\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.856394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-cni-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.856394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856370 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-cnibin\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856394 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856379 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-run-ovn-kubernetes\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-run-ovn-kubernetes\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-cni-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-sys\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856495 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-os-release\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856518 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856526 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-sys\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856526 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-cni-multus\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-kubelet\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-cni-multus\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-conf-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-os-release\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856613 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-conf-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856616 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-sys-fs\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-etc-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856707 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-sys-fs\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.856722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-kubernetes\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-etc-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856732 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-kubelet\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-tuned\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-cni-binary-copy\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-kubernetes\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-etc-kubernetes\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e118d567-f878-439c-b74f-3e060f10ac46-serviceca\") pod \"node-ca-bgw9g\" (UID: \"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-etc-kubernetes\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-etc-selinux\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-run\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c09ee36-d808-4615-bfd1-9a6a361f3a56-tmp\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd8kw\" (UniqueName: \"kubernetes.io/projected/80bdebdd-794b-491a-b6b9-8ac831319fea-kube-api-access-dd8kw\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klsss\" (UniqueName: 
\"kubernetes.io/projected/e118d567-f878-439c-b74f-3e060f10ac46-kube-api-access-klsss\") pod \"node-ca-bgw9g\" (UID: \"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.856993 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-etc-selinux\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-run\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-systemd-units\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.857501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/63b43425-e238-4a1e-a63c-4872ab241776-konnectivity-ca\") pod \"konnectivity-agent-2zcz2\" (UID: \"63b43425-e238-4a1e-a63c-4872ab241776\") " pod="kube-system/konnectivity-agent-2zcz2" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857098 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-systemd-units\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e118d567-f878-439c-b74f-3e060f10ac46-serviceca\") pod \"node-ca-bgw9g\" (UID: \"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-cni-binary-copy\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857384 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-cni-bin\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysctl-conf\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857435 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-cni-bin\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwkj9\" (UniqueName: \"kubernetes.io/projected/2c09ee36-d808-4615-bfd1-9a6a361f3a56-kube-api-access-fwkj9\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857484 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e118d567-f878-439c-b74f-3e060f10ac46-host\") pod \"node-ca-bgw9g\" (UID: \"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857541 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-socket-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysctl-conf\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857589 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovnkube-config\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857620 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysconfig\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857642 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e118d567-f878-439c-b74f-3e060f10ac46-host\") pod \"node-ca-bgw9g\" (UID: \"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-var-lib-kubelet\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.858323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-netns\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857699 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysconfig\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-hostroot\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857736 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-socket-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdr5\" (UniqueName: \"kubernetes.io/projected/31c0602b-6394-42d8-b7cc-1a807f7ea065-kube-api-access-6tdr5\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857758 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-hostroot\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857763 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/63b43425-e238-4a1e-a63c-4872ab241776-konnectivity-ca\") pod \"konnectivity-agent-2zcz2\" (UID: \"63b43425-e238-4a1e-a63c-4872ab241776\") " pod="kube-system/konnectivity-agent-2zcz2" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-run-netns\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857650 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-node-log\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857822 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-env-overrides\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-modprobe-d\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-netns\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857873 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-lib-modules\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkh97\" (UniqueName: \"kubernetes.io/projected/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-kube-api-access-gkh97\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80bdebdd-794b-491a-b6b9-8ac831319fea-host-slash\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857949 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-log-socket\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysctl-d\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.859124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-k8s-cni-cncf-io\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/80bdebdd-794b-491a-b6b9-8ac831319fea-iptables-alerter-script\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858094 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-device-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858119 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-62zrl\" (UniqueName: \"kubernetes.io/projected/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-kube-api-access-62zrl\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/63b43425-e238-4a1e-a63c-4872ab241776-agent-certs\") pod \"konnectivity-agent-2zcz2\" (UID: \"63b43425-e238-4a1e-a63c-4872ab241776\") " pod="kube-system/konnectivity-agent-2zcz2" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-daemon-config\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-log-socket\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qthn\" (UniqueName: \"kubernetes.io/projected/664d5264-1f8a-4986-9272-2e8a718a8923-kube-api-access-6qthn\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.857808 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-var-lib-kubelet\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858269 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-cni-binary-copy\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858328 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovnkube-config\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-env-overrides\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-k8s-cni-cncf-io\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858452 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-modprobe-d\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-lib-modules\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-sysctl-d\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.860024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-run-netns\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858517 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-node-log\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858331 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858563 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-device-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-systemd\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-cni-netd\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovnkube-script-lib\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858654 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-system-cni-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-socket-dir-parent\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/664d5264-1f8a-4986-9272-2e8a718a8923-hosts-file\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-registration-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-kubelet\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858780 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/80bdebdd-794b-491a-b6b9-8ac831319fea-host-slash\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-slash\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-var-lib-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-host\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-cni-bin\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.860703 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858880 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/80bdebdd-794b-491a-b6b9-8ac831319fea-iptables-alerter-script\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/664d5264-1f8a-4986-9272-2e8a718a8923-tmp-dir\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-system-cni-dir\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-os-release\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2v2f\" (UniqueName: \"kubernetes.io/projected/04b9f693-ce4b-4c09-b1df-b3f3382c693c-kube-api-access-r2v2f\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-systemd\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-multus-certs\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-run-multus-certs\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-run-systemd\") pod \"ovnkube-node-46t57\" (UID: 
\"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859266 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-var-lib-openvswitch\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859278 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-host\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859320 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-cni-netd\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-host-var-lib-cni-bin\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859428 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-daemon-config\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " 
pod="openshift-multus/multus-76v98" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.858328 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.861228 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859644 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/664d5264-1f8a-4986-9272-2e8a718a8923-tmp-dir\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859657 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-kubelet\") pod 
\"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/664d5264-1f8a-4986-9272-2e8a718a8923-hosts-file\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-systemd\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859764 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/04b9f693-ce4b-4c09-b1df-b3f3382c693c-registration-dir\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859771 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-system-cni-dir\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859802 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-host-slash\") pod 
\"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31c0602b-6394-42d8-b7cc-1a807f7ea065-os-release\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-system-cni-dir\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859853 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovnkube-script-lib\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859865 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-multus-socket-dir-parent\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.859979 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31c0602b-6394-42d8-b7cc-1a807f7ea065-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.860189 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-ovn-node-metrics-cert\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.860298 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c09ee36-d808-4615-bfd1-9a6a361f3a56-tmp\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.860482 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2c09ee36-d808-4615-bfd1-9a6a361f3a56-etc-tuned\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.861724 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.861698 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/63b43425-e238-4a1e-a63c-4872ab241776-agent-certs\") pod \"konnectivity-agent-2zcz2\" (UID: \"63b43425-e238-4a1e-a63c-4872ab241776\") " pod="kube-system/konnectivity-agent-2zcz2" Apr 24 22:29:52.870072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.870056 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klsss\" (UniqueName: \"kubernetes.io/projected/e118d567-f878-439c-b74f-3e060f10ac46-kube-api-access-klsss\") pod \"node-ca-bgw9g\" (UID: 
\"e118d567-f878-439c-b74f-3e060f10ac46\") " pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:52.873784 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.873760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qthn\" (UniqueName: \"kubernetes.io/projected/664d5264-1f8a-4986-9272-2e8a718a8923-kube-api-access-6qthn\") pod \"node-resolver-mjsxr\" (UID: \"664d5264-1f8a-4986-9272-2e8a718a8923\") " pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:52.874265 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.874244 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwkj9\" (UniqueName: \"kubernetes.io/projected/2c09ee36-d808-4615-bfd1-9a6a361f3a56-kube-api-access-fwkj9\") pod \"tuned-stvmb\" (UID: \"2c09ee36-d808-4615-bfd1-9a6a361f3a56\") " pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:52.874533 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.874514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2v2f\" (UniqueName: \"kubernetes.io/projected/04b9f693-ce4b-4c09-b1df-b3f3382c693c-kube-api-access-r2v2f\") pod \"aws-ebs-csi-driver-node-7hzw4\" (UID: \"04b9f693-ce4b-4c09-b1df-b3f3382c693c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:52.874970 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.874950 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdr5\" (UniqueName: \"kubernetes.io/projected/31c0602b-6394-42d8-b7cc-1a807f7ea065-kube-api-access-6tdr5\") pod \"multus-additional-cni-plugins-n6k9b\" (UID: \"31c0602b-6394-42d8-b7cc-1a807f7ea065\") " pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:52.875303 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.875288 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkh97\" (UniqueName: 
\"kubernetes.io/projected/1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e-kube-api-access-gkh97\") pod \"multus-76v98\" (UID: \"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e\") " pod="openshift-multus/multus-76v98" Apr 24 22:29:52.875653 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.875630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62zrl\" (UniqueName: \"kubernetes.io/projected/dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402-kube-api-access-62zrl\") pod \"ovnkube-node-46t57\" (UID: \"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402\") " pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:52.875839 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.875824 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd8kw\" (UniqueName: \"kubernetes.io/projected/80bdebdd-794b-491a-b6b9-8ac831319fea-kube-api-access-dd8kw\") pod \"iptables-alerter-dwv7l\" (UID: \"80bdebdd-794b-491a-b6b9-8ac831319fea\") " pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:52.959678 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.959646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:52.959678 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.959680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:52.959875 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.959699 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zwvgk\" (UniqueName: \"kubernetes.io/projected/12c9d9cf-479c-46fd-9333-94213f4ff2f0-kube-api-access-zwvgk\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:52.959875 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.959789 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:52.959875 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.959871 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:53.459841386 +0000 UTC m=+2.156464219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:52.973191 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.973167 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:52.973191 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.973188 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:52.973351 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.973201 2571 projected.go:194] Error preparing data for projected volume kube-api-access-nhf9q for pod openshift-network-diagnostics/network-check-target-ngpww: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:52.973351 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:52.973269 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q podName:cba3f39b-cb19-416b-a21a-64491aff6ce9 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:53.473249516 +0000 UTC m=+2.169872362 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nhf9q" (UniqueName: "kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q") pod "network-check-target-ngpww" (UID: "cba3f39b-cb19-416b-a21a-64491aff6ce9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:52.977137 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:52.977120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwvgk\" (UniqueName: \"kubernetes.io/projected/12c9d9cf-479c-46fd-9333-94213f4ff2f0-kube-api-access-zwvgk\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:53.052929 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.052907 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dwv7l" Apr 24 22:29:53.059433 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.059414 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bgw9g" Apr 24 22:29:53.059792 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.059763 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80bdebdd_794b_491a_b6b9_8ac831319fea.slice/crio-6a336cf53b4a22e07356384ccba9d122d4e7e914caeadee010061954b2e67179 WatchSource:0}: Error finding container 6a336cf53b4a22e07356384ccba9d122d4e7e914caeadee010061954b2e67179: Status 404 returned error can't find the container with id 6a336cf53b4a22e07356384ccba9d122d4e7e914caeadee010061954b2e67179 Apr 24 22:29:53.067460 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.067427 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:29:53.067691 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.067665 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode118d567_f878_439c_b74f_3e060f10ac46.slice/crio-890f496a123df2049feed06f8dc5c2d0e70cdbdf4ef9d9d7e0c319cf0ef89926 WatchSource:0}: Error finding container 890f496a123df2049feed06f8dc5c2d0e70cdbdf4ef9d9d7e0c319cf0ef89926: Status 404 returned error can't find the container with id 890f496a123df2049feed06f8dc5c2d0e70cdbdf4ef9d9d7e0c319cf0ef89926 Apr 24 22:29:53.071944 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.071925 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-stvmb" Apr 24 22:29:53.075890 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.075867 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc3ddf9_fec3_48d5_8871_ba8f9b5c3402.slice/crio-48788f9b5f2790d26898d801ca2b9e974b9bf4d3ac33bda87c28119faaad04dc WatchSource:0}: Error finding container 48788f9b5f2790d26898d801ca2b9e974b9bf4d3ac33bda87c28119faaad04dc: Status 404 returned error can't find the container with id 48788f9b5f2790d26898d801ca2b9e974b9bf4d3ac33bda87c28119faaad04dc Apr 24 22:29:53.077446 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.077426 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mjsxr" Apr 24 22:29:53.082213 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.082191 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c09ee36_d808_4615_bfd1_9a6a361f3a56.slice/crio-c755a757af9203fd13072b29913aac86bcd67ca3734871520de1aa4294746d46 WatchSource:0}: Error finding container c755a757af9203fd13072b29913aac86bcd67ca3734871520de1aa4294746d46: Status 404 returned error can't find the container with id c755a757af9203fd13072b29913aac86bcd67ca3734871520de1aa4294746d46 Apr 24 22:29:53.083625 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.083606 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" Apr 24 22:29:53.085999 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.085727 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod664d5264_1f8a_4986_9272_2e8a718a8923.slice/crio-a91d626d88c4f7412887048aacda3af1ac76f6c62e2ac232497d863db1cdf08a WatchSource:0}: Error finding container a91d626d88c4f7412887048aacda3af1ac76f6c62e2ac232497d863db1cdf08a: Status 404 returned error can't find the container with id a91d626d88c4f7412887048aacda3af1ac76f6c62e2ac232497d863db1cdf08a Apr 24 22:29:53.088590 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.088571 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2zcz2" Apr 24 22:29:53.092384 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.092364 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c0602b_6394_42d8_b7cc_1a807f7ea065.slice/crio-575fae66f393871997d2ec582b5592870955556aa5af16cc9796e8a0026c3626 WatchSource:0}: Error finding container 575fae66f393871997d2ec582b5592870955556aa5af16cc9796e8a0026c3626: Status 404 returned error can't find the container with id 575fae66f393871997d2ec582b5592870955556aa5af16cc9796e8a0026c3626 Apr 24 22:29:53.095967 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.095950 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" Apr 24 22:29:53.097117 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.097091 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b43425_e238_4a1e_a63c_4872ab241776.slice/crio-866ffda918922b433191f52cd5fce8980a1ad4966b47b188f92d70fed0deee0b WatchSource:0}: Error finding container 866ffda918922b433191f52cd5fce8980a1ad4966b47b188f92d70fed0deee0b: Status 404 returned error can't find the container with id 866ffda918922b433191f52cd5fce8980a1ad4966b47b188f92d70fed0deee0b Apr 24 22:29:53.100345 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.100302 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-76v98" Apr 24 22:29:53.105653 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.105626 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b9f693_ce4b_4c09_b1df_b3f3382c693c.slice/crio-5f0b23264c375e0aad9dcbe5bfb21eefb5f71f21f41ae7ed35daa2251817ee59 WatchSource:0}: Error finding container 5f0b23264c375e0aad9dcbe5bfb21eefb5f71f21f41ae7ed35daa2251817ee59: Status 404 returned error can't find the container with id 5f0b23264c375e0aad9dcbe5bfb21eefb5f71f21f41ae7ed35daa2251817ee59 Apr 24 22:29:53.108650 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:29:53.108628 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b7f84b7_e2ee_446a_9a43_d262fa8dfe1e.slice/crio-0dd543b69ed83f1202ea965dad4973bdb98501ba2f4c1ee25641642a2671d552 WatchSource:0}: Error finding container 0dd543b69ed83f1202ea965dad4973bdb98501ba2f4c1ee25641642a2671d552: Status 404 returned error can't find the container with id 0dd543b69ed83f1202ea965dad4973bdb98501ba2f4c1ee25641642a2671d552 Apr 24 22:29:53.206683 ip-10-0-133-9 
kubenswrapper[2571]: I0424 22:29:53.206657 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:53.308820 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.308663 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:53.463653 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.463616 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:53.463823 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:53.463776 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:53.463892 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:53.463838 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:54.46381999 +0000 UTC m=+3.160442825 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:53.564679 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.564643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:53.564843 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:53.564812 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:53.564843 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:53.564832 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:53.564951 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:53.564845 2571 projected.go:194] Error preparing data for projected volume kube-api-access-nhf9q for pod openshift-network-diagnostics/network-check-target-ngpww: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:53.564951 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:53.564905 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q podName:cba3f39b-cb19-416b-a21a-64491aff6ce9 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:54.564886773 +0000 UTC m=+3.261509601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhf9q" (UniqueName: "kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q") pod "network-check-target-ngpww" (UID: "cba3f39b-cb19-416b-a21a-64491aff6ce9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:53.803405 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.803028 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:52 +0000 UTC" deadline="2028-01-12 14:51:10.069951964 +0000 UTC" Apr 24 22:29:53.803405 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.803306 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15064h21m16.266654068s" Apr 24 22:29:53.859699 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.859653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bgw9g" event={"ID":"e118d567-f878-439c-b74f-3e060f10ac46","Type":"ContainerStarted","Data":"890f496a123df2049feed06f8dc5c2d0e70cdbdf4ef9d9d7e0c319cf0ef89926"} Apr 24 22:29:53.870416 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.870363 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dwv7l" event={"ID":"80bdebdd-794b-491a-b6b9-8ac831319fea","Type":"ContainerStarted","Data":"6a336cf53b4a22e07356384ccba9d122d4e7e914caeadee010061954b2e67179"} Apr 24 22:29:53.877768 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.877686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-76v98" 
event={"ID":"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e","Type":"ContainerStarted","Data":"0dd543b69ed83f1202ea965dad4973bdb98501ba2f4c1ee25641642a2671d552"} Apr 24 22:29:53.896993 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.896964 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2zcz2" event={"ID":"63b43425-e238-4a1e-a63c-4872ab241776","Type":"ContainerStarted","Data":"866ffda918922b433191f52cd5fce8980a1ad4966b47b188f92d70fed0deee0b"} Apr 24 22:29:53.902267 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.902242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerStarted","Data":"575fae66f393871997d2ec582b5592870955556aa5af16cc9796e8a0026c3626"} Apr 24 22:29:53.918133 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.918106 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mjsxr" event={"ID":"664d5264-1f8a-4986-9272-2e8a718a8923","Type":"ContainerStarted","Data":"a91d626d88c4f7412887048aacda3af1ac76f6c62e2ac232497d863db1cdf08a"} Apr 24 22:29:53.921968 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.921936 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-stvmb" event={"ID":"2c09ee36-d808-4615-bfd1-9a6a361f3a56","Type":"ContainerStarted","Data":"c755a757af9203fd13072b29913aac86bcd67ca3734871520de1aa4294746d46"} Apr 24 22:29:53.927721 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.927695 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" event={"ID":"04b9f693-ce4b-4c09-b1df-b3f3382c693c","Type":"ContainerStarted","Data":"5f0b23264c375e0aad9dcbe5bfb21eefb5f71f21f41ae7ed35daa2251817ee59"} Apr 24 22:29:53.949921 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:53.949896 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"48788f9b5f2790d26898d801ca2b9e974b9bf4d3ac33bda87c28119faaad04dc"} Apr 24 22:29:54.163016 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:54.159671 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:54.471859 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:54.471203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:54.471859 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:54.471389 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:54.471859 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:54.471451 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:56.471430811 +0000 UTC m=+5.168053643 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:54.572885 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:54.572309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:54.572885 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:54.572458 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:54.572885 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:54.572476 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:54.572885 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:54.572487 2571 projected.go:194] Error preparing data for projected volume kube-api-access-nhf9q for pod openshift-network-diagnostics/network-check-target-ngpww: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:54.572885 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:54.572551 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q podName:cba3f39b-cb19-416b-a21a-64491aff6ce9 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:29:56.572532559 +0000 UTC m=+5.269155403 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhf9q" (UniqueName: "kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q") pod "network-check-target-ngpww" (UID: "cba3f39b-cb19-416b-a21a-64491aff6ce9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:54.804030 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:54.803940 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:52 +0000 UTC" deadline="2027-12-11 20:00:46.268254886 +0000 UTC" Apr 24 22:29:54.804030 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:54.803985 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14301h30m51.464274156s" Apr 24 22:29:54.849395 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:54.849355 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:54.849572 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:54.849542 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:29:54.849671 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:54.849658 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:54.849858 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:54.849829 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:29:56.489298 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:56.489266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:56.489690 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:56.489398 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:56.489690 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:56.489465 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:00.489445713 +0000 UTC m=+9.186068553 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:56.590580 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:56.590545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:56.590740 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:56.590710 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:56.590740 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:56.590733 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:56.590835 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:56.590747 2571 projected.go:194] Error preparing data for projected volume kube-api-access-nhf9q for pod openshift-network-diagnostics/network-check-target-ngpww: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:56.590835 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:56.590807 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q podName:cba3f39b-cb19-416b-a21a-64491aff6ce9 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:00.590788802 +0000 UTC m=+9.287411646 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhf9q" (UniqueName: "kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q") pod "network-check-target-ngpww" (UID: "cba3f39b-cb19-416b-a21a-64491aff6ce9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:56.848851 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:56.848767 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:56.849007 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:56.848767 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:56.849007 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:56.848904 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:29:56.849007 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:56.848944 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:29:58.849232 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:58.849167 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:29:58.849651 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:58.849279 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:29:58.849651 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:29:58.849600 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:29:58.849768 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:29:58.849701 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:30:00.521399 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:00.521362 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:00.521845 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:00.521528 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:00.521845 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:00.521591 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:08.521574506 +0000 UTC m=+17.218197341 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:00.621784 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:00.621737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:00.622008 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:00.621888 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:00.622008 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:00.621910 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:00.622008 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:00.621925 2571 projected.go:194] Error preparing data for projected volume kube-api-access-nhf9q for pod openshift-network-diagnostics/network-check-target-ngpww: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:00.622008 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:00.621989 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q podName:cba3f39b-cb19-416b-a21a-64491aff6ce9 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:08.621971096 +0000 UTC m=+17.318593938 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhf9q" (UniqueName: "kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q") pod "network-check-target-ngpww" (UID: "cba3f39b-cb19-416b-a21a-64491aff6ce9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:00.849270 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:00.849195 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:00.849560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:00.849195 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:00.849560 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:00.849337 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:30:00.849560 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:00.849449 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:30:02.849374 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:02.849341 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:02.849772 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:02.849341 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:02.849772 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:02.849478 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:30:02.849772 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:02.849541 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:30:04.848770 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:04.848738 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:04.849196 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:04.848739 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:04.849196 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:04.848863 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:30:04.849196 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:04.848941 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:30:06.849042 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:06.849016 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:06.849380 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:06.849016 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:06.849380 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:06.849129 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:30:06.849380 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:06.849250 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:30:08.586029 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:08.585998 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:08.586567 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:08.586130 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:08.586567 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:08.586202 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:24.586188725 +0000 UTC m=+33.282811561 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:08.686460 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:08.686418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:08.686632 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:08.686536 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:08.686632 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:08.686549 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:08.686632 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:08.686560 2571 projected.go:194] Error preparing data for projected volume kube-api-access-nhf9q for pod openshift-network-diagnostics/network-check-target-ngpww: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:08.686632 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:08.686610 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q podName:cba3f39b-cb19-416b-a21a-64491aff6ce9 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:24.686596377 +0000 UTC m=+33.383219219 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhf9q" (UniqueName: "kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q") pod "network-check-target-ngpww" (UID: "cba3f39b-cb19-416b-a21a-64491aff6ce9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:08.849171 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:08.849082 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:08.849333 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:08.849083 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:08.849333 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:08.849219 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:30:08.849333 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:08.849274 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:30:10.848634 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:10.848600 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:10.849064 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:10.848700 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9" Apr 24 22:30:10.849064 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:10.848610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:10.849064 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:10.848838 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:30:11.982507 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.982218 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log" Apr 24 22:30:11.983203 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.982708 2571 generic.go:358] "Generic (PLEG): container finished" podID="dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402" containerID="e3cd05fe7efbd118759d5a07b7f2a176d07e0fe7677c10585d9938bc2ee5fd04" exitCode=1 Apr 24 22:30:11.983203 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.982762 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"c156a0f980c3c993c94d9666df04377d16f760e9fd0811e47373c9744ce55bed"} Apr 24 22:30:11.983203 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.982778 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"a9db8bcc7df597e5f6a1b091c52ad470ea4a56e2137f4171ea48c43289f31d2d"} Apr 24 22:30:11.983203 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.982787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"7a0cd8e43616073e9a09f12dbdf14c0cf8dec1ce523201e98bba5a753b21651e"} Apr 24 22:30:11.983203 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.982796 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"72cce6a4f9ed0b4689613029d598b71e641f1b10e4fb9a88aed1ab71a7b66a85"} Apr 24 22:30:11.983203 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.982804 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerDied","Data":"e3cd05fe7efbd118759d5a07b7f2a176d07e0fe7677c10585d9938bc2ee5fd04"} Apr 24 22:30:11.983203 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.982813 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"8125d24f255696c96ae05b9120315ae016362dd2c3325bdc42369cae3c15aa49"} Apr 24 22:30:11.983941 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.983916 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bgw9g" event={"ID":"e118d567-f878-439c-b74f-3e060f10ac46","Type":"ContainerStarted","Data":"21a1eb694d0e26f20f882b56316b57a8efe338aee228fdd799e8efa8e4cffca1"} Apr 24 22:30:11.985164 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.985135 2571 generic.go:358] "Generic (PLEG): container finished" podID="d104bb52543cda5d9dce608c30af4ba4" containerID="0099cd215107fd4078f314619424f5bfda1097d266112298244ad35b17962dcc" exitCode=0 Apr 24 22:30:11.985257 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.985174 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal" event={"ID":"d104bb52543cda5d9dce608c30af4ba4","Type":"ContainerDied","Data":"0099cd215107fd4078f314619424f5bfda1097d266112298244ad35b17962dcc"} Apr 24 22:30:11.986456 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.986439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal" event={"ID":"05a6b818b62eb67d03d54038b125e714","Type":"ContainerStarted","Data":"f7a8aed7b452935ee17b9cdc672733fed149914d9f750910f6748243240e07e6"} Apr 24 22:30:11.987733 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.987708 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-76v98" event={"ID":"1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e","Type":"ContainerStarted","Data":"4d6db4e1e47ca980e162d7ab9d79be67573e54c84acc2da5fa18235e1e1b4ade"} Apr 24 22:30:11.988996 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.988894 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2zcz2" event={"ID":"63b43425-e238-4a1e-a63c-4872ab241776","Type":"ContainerStarted","Data":"d869581352f5ac30ea2b4ea0587af69c6fe766f187501f1ff57ff871ecfe4407"} Apr 24 22:30:11.990138 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.990117 2571 generic.go:358] "Generic (PLEG): container finished" podID="31c0602b-6394-42d8-b7cc-1a807f7ea065" containerID="cc922143a3ee51ac37adaf227d9d87385d2befe1c609cf0ab1c819afb45524d8" exitCode=0 Apr 24 22:30:11.990235 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.990183 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerDied","Data":"cc922143a3ee51ac37adaf227d9d87385d2befe1c609cf0ab1c819afb45524d8"} Apr 24 22:30:11.991474 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.991454 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mjsxr" event={"ID":"664d5264-1f8a-4986-9272-2e8a718a8923","Type":"ContainerStarted","Data":"973340aa52065a58cb95a1a31f41e4dd3e965d867d6fd5fd263878c705495bdd"} Apr 24 22:30:11.992619 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.992602 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-stvmb" event={"ID":"2c09ee36-d808-4615-bfd1-9a6a361f3a56","Type":"ContainerStarted","Data":"4699707edda23d824edfafd39901cd8a0f0c749278d00e02fac4683165022a2e"} Apr 24 22:30:11.993741 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:11.993722 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" event={"ID":"04b9f693-ce4b-4c09-b1df-b3f3382c693c","Type":"ContainerStarted","Data":"964511b3e3e8da84f51fa5c24ad1a8baba77df083395d31c6e2bd905240f7b25"}
Apr 24 22:30:12.023460 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.023409 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bgw9g" podStartSLOduration=3.0884893350000002 podStartE2EDuration="21.023394802s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.070536273 +0000 UTC m=+1.767159102" lastFinishedPulling="2026-04-24 22:30:11.005441724 +0000 UTC m=+19.702064569" observedRunningTime="2026-04-24 22:30:12.006483694 +0000 UTC m=+20.703106545" watchObservedRunningTime="2026-04-24 22:30:12.023394802 +0000 UTC m=+20.720017653"
Apr 24 22:30:12.071642 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.071598 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-76v98" podStartSLOduration=2.175073967 podStartE2EDuration="20.071583119s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.110829283 +0000 UTC m=+1.807452115" lastFinishedPulling="2026-04-24 22:30:11.007338424 +0000 UTC m=+19.703961267" observedRunningTime="2026-04-24 22:30:12.071197705 +0000 UTC m=+20.767820552" watchObservedRunningTime="2026-04-24 22:30:12.071583119 +0000 UTC m=+20.768205970"
Apr 24 22:30:12.086770 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.086725 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-9.ec2.internal" podStartSLOduration=20.08671107 podStartE2EDuration="20.08671107s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:12.086620962 +0000 UTC m=+20.783243813" watchObservedRunningTime="2026-04-24 22:30:12.08671107 +0000 UTC m=+20.783333924"
Apr 24 22:30:12.103420 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.103379 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-stvmb" podStartSLOduration=2.1811480420000002 podStartE2EDuration="20.103368558s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.085374008 +0000 UTC m=+1.781996836" lastFinishedPulling="2026-04-24 22:30:11.00759452 +0000 UTC m=+19.704217352" observedRunningTime="2026-04-24 22:30:12.102964725 +0000 UTC m=+20.799587577" watchObservedRunningTime="2026-04-24 22:30:12.103368558 +0000 UTC m=+20.799991408"
Apr 24 22:30:12.123304 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.123259 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mjsxr" podStartSLOduration=2.204745716 podStartE2EDuration="20.123246166s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.088440753 +0000 UTC m=+1.785063599" lastFinishedPulling="2026-04-24 22:30:11.006941204 +0000 UTC m=+19.703564049" observedRunningTime="2026-04-24 22:30:12.12320133 +0000 UTC m=+20.819824195" watchObservedRunningTime="2026-04-24 22:30:12.123246166 +0000 UTC m=+20.819869036"
Apr 24 22:30:12.537745 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.537720 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 22:30:12.820546 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.820416 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:30:12.537740355Z","UUID":"5d112f73-f7fc-481c-bc7b-fc1a1801bd6e","Handler":null,"Name":"","Endpoint":""}
Apr 24 22:30:12.822161 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.822137 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 22:30:12.822281 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.822184 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 22:30:12.848494 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.848466 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:30:12.848632 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:12.848578 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9"
Apr 24 22:30:12.848632 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.848591 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln"
Apr 24 22:30:12.848787 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:12.848694 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0"
Apr 24 22:30:12.997571 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.997530 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dwv7l" event={"ID":"80bdebdd-794b-491a-b6b9-8ac831319fea","Type":"ContainerStarted","Data":"dc88de55b5a3ac50908d431cba4c4d770586262001752c39afdf642b1ce35f82"}
Apr 24 22:30:12.999815 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:12.999785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal" event={"ID":"d104bb52543cda5d9dce608c30af4ba4","Type":"ContainerStarted","Data":"d3d8abc0de8bfc29fcfd980bfaa58e2c9866701917b2277a36d27b2c06f92aec"}
Apr 24 22:30:13.002030 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.001998 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" event={"ID":"04b9f693-ce4b-4c09-b1df-b3f3382c693c","Type":"ContainerStarted","Data":"6c06232f32412a05dd80ebd254a965eb33d9d73111ff22a1c02529ea5d38a2ad"}
Apr 24 22:30:13.011819 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.011784 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2zcz2" podStartSLOduration=12.46565255 podStartE2EDuration="21.01177114s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.100304518 +0000 UTC m=+1.796927347" lastFinishedPulling="2026-04-24 22:30:01.646423094 +0000 UTC m=+10.343045937" observedRunningTime="2026-04-24 22:30:12.137931339 +0000 UTC m=+20.834554188" watchObservedRunningTime="2026-04-24 22:30:13.01177114 +0000 UTC m=+21.708393990"
Apr 24 22:30:13.027801 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.027764 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dwv7l" podStartSLOduration=4.083937753 podStartE2EDuration="22.027749979s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.061660285 +0000 UTC m=+1.758283117" lastFinishedPulling="2026-04-24 22:30:11.005472503 +0000 UTC m=+19.702095343" observedRunningTime="2026-04-24 22:30:13.012211069 +0000 UTC m=+21.708833914" watchObservedRunningTime="2026-04-24 22:30:13.027749979 +0000 UTC m=+21.724372830"
Apr 24 22:30:13.028230 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.028193 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-9.ec2.internal" podStartSLOduration=21.028182498 podStartE2EDuration="21.028182498s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:13.027666834 +0000 UTC m=+21.724289684" watchObservedRunningTime="2026-04-24 22:30:13.028182498 +0000 UTC m=+21.724805349"
Apr 24 22:30:13.563170 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.563135 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8cmm2"]
Apr 24 22:30:13.566037 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.566019 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.566168 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:13.566087 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cmm2" podUID="46cdb586-7bdd-41d4-9d74-7e99334be435"
Apr 24 22:30:13.624372 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.624349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/46cdb586-7bdd-41d4-9d74-7e99334be435-kubelet-config\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.624489 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.624378 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.624489 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.624447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/46cdb586-7bdd-41d4-9d74-7e99334be435-dbus\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.725629 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.725597 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/46cdb586-7bdd-41d4-9d74-7e99334be435-dbus\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.725776 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.725669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/46cdb586-7bdd-41d4-9d74-7e99334be435-kubelet-config\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.725776 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.725695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.725776 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.725763 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/46cdb586-7bdd-41d4-9d74-7e99334be435-dbus\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.725894 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:13.725836 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:13.725894 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:13.725854 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/46cdb586-7bdd-41d4-9d74-7e99334be435-kubelet-config\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:13.725894 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:13.725892 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret podName:46cdb586-7bdd-41d4-9d74-7e99334be435 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.225874516 +0000 UTC m=+22.922497359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret") pod "global-pull-secret-syncer-8cmm2" (UID: "46cdb586-7bdd-41d4-9d74-7e99334be435") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:14.005616 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:14.005564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" event={"ID":"04b9f693-ce4b-4c09-b1df-b3f3382c693c","Type":"ContainerStarted","Data":"df1b0589e060843eebee16dd74ed133ab5573e2d5ecc6b3391cc99185c6f1f52"}
Apr 24 22:30:14.008849 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:14.008819 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log"
Apr 24 22:30:14.009223 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:14.009194 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"76e5ae4f2a537ece2d4fa3f122dd0678d05b911ec79a403fb1f45bba7eb82c77"}
Apr 24 22:30:14.230347 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:14.230313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:14.230507 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:14.230438 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:14.230507 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:14.230493 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret podName:46cdb586-7bdd-41d4-9d74-7e99334be435 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:15.230479445 +0000 UTC m=+23.927102274 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret") pod "global-pull-secret-syncer-8cmm2" (UID: "46cdb586-7bdd-41d4-9d74-7e99334be435") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:14.849501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:14.849321 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln"
Apr 24 22:30:14.849678 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:14.849321 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:30:14.849678 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:14.849607 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0"
Apr 24 22:30:14.849678 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:14.849328 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:14.849806 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:14.849689 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9"
Apr 24 22:30:14.849806 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:14.849789 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cmm2" podUID="46cdb586-7bdd-41d4-9d74-7e99334be435"
Apr 24 22:30:15.238773 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:15.238740 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:15.239175 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:15.238886 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:15.239175 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:15.238947 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret podName:46cdb586-7bdd-41d4-9d74-7e99334be435 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:17.238933328 +0000 UTC m=+25.935556156 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret") pod "global-pull-secret-syncer-8cmm2" (UID: "46cdb586-7bdd-41d4-9d74-7e99334be435") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:16.311684 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:16.311559 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2zcz2"
Apr 24 22:30:16.312261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:16.312067 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2zcz2"
Apr 24 22:30:16.339345 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:16.339290 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7hzw4" podStartSLOduration=4.18232797 podStartE2EDuration="24.339269959s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.107696249 +0000 UTC m=+1.804319078" lastFinishedPulling="2026-04-24 22:30:13.264638227 +0000 UTC m=+21.961261067" observedRunningTime="2026-04-24 22:30:14.057681974 +0000 UTC m=+22.754304826" watchObservedRunningTime="2026-04-24 22:30:16.339269959 +0000 UTC m=+25.035892811"
Apr 24 22:30:16.848439 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:16.848403 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:30:16.848663 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:16.848497 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9"
Apr 24 22:30:16.848663 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:16.848419 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:16.848663 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:16.848556 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cmm2" podUID="46cdb586-7bdd-41d4-9d74-7e99334be435"
Apr 24 22:30:16.848663 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:16.848412 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln"
Apr 24 22:30:16.848663 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:16.848621 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0"
Apr 24 22:30:17.016980 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.016950 2571 generic.go:358] "Generic (PLEG): container finished" podID="31c0602b-6394-42d8-b7cc-1a807f7ea065" containerID="8ab5d2c43897c34c0164e27e730644884c57aaf0df13769ed040f5c20926e37a" exitCode=0
Apr 24 22:30:17.017130 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.017027 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerDied","Data":"8ab5d2c43897c34c0164e27e730644884c57aaf0df13769ed040f5c20926e37a"}
Apr 24 22:30:17.020144 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.020124 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log"
Apr 24 22:30:17.020484 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.020464 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"a975424182782d8fc1bdbdb8a845fa43fe7601a9e778bc3ee19ced6af02d38c1"}
Apr 24 22:30:17.020752 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.020696 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2zcz2"
Apr 24 22:30:17.020886 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.020867 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46t57"
Apr 24 22:30:17.020966 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.020896 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46t57"
Apr 24 22:30:17.021091 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.021071 2571 scope.go:117] "RemoveContainer" containerID="e3cd05fe7efbd118759d5a07b7f2a176d07e0fe7677c10585d9938bc2ee5fd04"
Apr 24 22:30:17.021204 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.021103 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2zcz2"
Apr 24 22:30:17.035299 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.035282 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46t57"
Apr 24 22:30:17.035851 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.035837 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46t57"
Apr 24 22:30:17.254403 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.254356 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:17.254597 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:17.254484 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:17.254597 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:17.254565 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret podName:46cdb586-7bdd-41d4-9d74-7e99334be435 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:21.25454496 +0000 UTC m=+29.951167796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret") pod "global-pull-secret-syncer-8cmm2" (UID: "46cdb586-7bdd-41d4-9d74-7e99334be435") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:17.971972 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.971913 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tphln"]
Apr 24 22:30:17.972372 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.972017 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln"
Apr 24 22:30:17.972372 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:17.972097 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0"
Apr 24 22:30:17.977661 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.977633 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ngpww"]
Apr 24 22:30:17.977790 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.977777 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:30:17.977898 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:17.977876 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9"
Apr 24 22:30:17.978304 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.978279 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8cmm2"]
Apr 24 22:30:17.978418 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:17.978377 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:17.978478 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:17.978448 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cmm2" podUID="46cdb586-7bdd-41d4-9d74-7e99334be435"
Apr 24 22:30:18.024341 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:18.024316 2571 generic.go:358] "Generic (PLEG): container finished" podID="31c0602b-6394-42d8-b7cc-1a807f7ea065" containerID="f8ade1ec5141efcfb5f37416442e6b8c79e0fb7304e92adaa4142867b9dfb194" exitCode=0
Apr 24 22:30:18.024451 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:18.024385 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerDied","Data":"f8ade1ec5141efcfb5f37416442e6b8c79e0fb7304e92adaa4142867b9dfb194"}
Apr 24 22:30:18.027707 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:18.027692 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log"
Apr 24 22:30:18.028034 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:18.028015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" event={"ID":"dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402","Type":"ContainerStarted","Data":"86003257545c488ea06aa5a34a2ac7751355ec90654a4d66bf6c442270fc0a37"}
Apr 24 22:30:18.028120 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:18.028105 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 22:30:18.098806 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:18.098784 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46t57"
Apr 24 22:30:19.031302 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:19.031234 2571 generic.go:358] "Generic (PLEG): container finished" podID="31c0602b-6394-42d8-b7cc-1a807f7ea065" containerID="84829775b63b25c55ba90ea7dcc5a1a2516cb706f0b67c092a398007f2246380" exitCode=0
Apr 24 22:30:19.031663 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:19.031317 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerDied","Data":"84829775b63b25c55ba90ea7dcc5a1a2516cb706f0b67c092a398007f2246380"}
Apr 24 22:30:19.076957 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:19.076913 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" podStartSLOduration=9.058575029 podStartE2EDuration="27.07690124s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.078254246 +0000 UTC m=+1.774877103" lastFinishedPulling="2026-04-24 22:30:11.096580483 +0000 UTC m=+19.793203314" observedRunningTime="2026-04-24 22:30:18.109842502 +0000 UTC m=+26.806465351" watchObservedRunningTime="2026-04-24 22:30:19.07690124 +0000 UTC m=+27.773524090"
Apr 24 22:30:19.852590 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:19.852495 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:19.852590 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:19.852506 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln"
Apr 24 22:30:19.852789 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:19.852614 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cmm2" podUID="46cdb586-7bdd-41d4-9d74-7e99334be435"
Apr 24 22:30:19.852789 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:19.852505 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:30:19.852789 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:19.852703 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0"
Apr 24 22:30:19.852789 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:19.852779 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9"
Apr 24 22:30:21.288365 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:21.288093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:21.288783 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:21.288247 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:21.288783 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:21.288467 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret podName:46cdb586-7bdd-41d4-9d74-7e99334be435 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:29.288445419 +0000 UTC m=+37.985068247 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret") pod "global-pull-secret-syncer-8cmm2" (UID: "46cdb586-7bdd-41d4-9d74-7e99334be435") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:21.848925 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:21.848879 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:21.850934 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:21.850897 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cmm2" podUID="46cdb586-7bdd-41d4-9d74-7e99334be435"
Apr 24 22:30:21.851071 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:21.850950 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln"
Apr 24 22:30:21.851071 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:21.850980 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:30:21.851208 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:21.851079 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0"
Apr 24 22:30:21.851267 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:21.851235 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9"
Apr 24 22:30:23.852579 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:23.852550 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2"
Apr 24 22:30:23.853068 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:23.852550 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:30:23.853068 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:23.852662 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cmm2" podUID="46cdb586-7bdd-41d4-9d74-7e99334be435"
Apr 24 22:30:23.853068 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:23.852554 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln"
Apr 24 22:30:23.853068 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:23.852757 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngpww" podUID="cba3f39b-cb19-416b-a21a-64491aff6ce9"
Apr 24 22:30:23.853068 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:23.852841 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0" Apr 24 22:30:24.096181 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.096081 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-9.ec2.internal" event="NodeReady" Apr 24 22:30:24.096345 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.096277 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 22:30:24.142701 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.142666 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c8cfb6695-qd2ch"] Apr 24 22:30:24.175832 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.175800 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c8cfb6695-qd2ch"] Apr 24 22:30:24.175832 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.175836 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vb7wv"] Apr 24 22:30:24.176036 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.175955 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.178684 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.178658 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 22:30:24.179104 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.179066 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 22:30:24.179227 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.179079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 22:30:24.179227 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.179116 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xzwft\"" Apr 24 22:30:24.185908 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.185258 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 22:30:24.195708 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.195689 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:24.200240 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.200207 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 22:30:24.200348 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.200244 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 22:30:24.200348 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.200257 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 22:30:24.200530 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.200512 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdthp\"" Apr 24 22:30:24.207032 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.207009 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vb7wv"] Apr 24 22:30:24.281129 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.281097 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dktqj"] Apr 24 22:30:24.298131 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.298107 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.301875 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.301847 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 22:30:24.301983 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.301908 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 22:30:24.302247 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.302212 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svq4d\"" Apr 24 22:30:24.303295 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.303273 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dktqj"] Apr 24 22:30:24.310313 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-trusted-ca\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.310408 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310330 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.310408 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-bound-sa-token\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.310520 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:24.310520 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6267a99a-1aea-462f-bab4-ce95abd8548d-ca-trust-extracted\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.310619 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310521 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-installation-pull-secrets\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.310619 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvczt\" (UniqueName: \"kubernetes.io/projected/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-kube-api-access-xvczt\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 
22:30:24.310619 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xkq9\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-kube-api-access-8xkq9\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.310734 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310622 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-image-registry-private-configuration\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.310734 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.310656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-certificates\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412053 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-trusted-ca\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412265 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412265 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412133 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24fpj\" (UniqueName: \"kubernetes.io/projected/859f8212-1b13-42e6-b832-83bafe50547d-kube-api-access-24fpj\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.412265 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-bound-sa-token\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412265 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.412216 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:24.412265 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.412238 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c8cfb6695-qd2ch: secret "image-registry-tls" not found Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.412303 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.412306 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls 
podName:6267a99a-1aea-462f-bab4-ce95abd8548d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:24.912285557 +0000 UTC m=+33.608908405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls") pod "image-registry-c8cfb6695-qd2ch" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d") : secret "image-registry-tls" not found Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.412357 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert podName:dc34a3cc-7a87-42d1-a0c7-d317f40146bb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:24.912342395 +0000 UTC m=+33.608965227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert") pod "ingress-canary-vb7wv" (UID: "dc34a3cc-7a87-42d1-a0c7-d317f40146bb") : secret "canary-serving-cert" not found Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6267a99a-1aea-462f-bab4-ce95abd8548d-ca-trust-extracted\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412422 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/859f8212-1b13-42e6-b832-83bafe50547d-config-volume\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/859f8212-1b13-42e6-b832-83bafe50547d-tmp-dir\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412490 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-installation-pull-secrets\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvczt\" (UniqueName: \"kubernetes.io/projected/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-kube-api-access-xvczt\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:24.412947 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412548 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.412947 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412574 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xkq9\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-kube-api-access-8xkq9\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412947 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412608 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-image-registry-private-configuration\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412947 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412637 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-certificates\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.412947 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6267a99a-1aea-462f-bab4-ce95abd8548d-ca-trust-extracted\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.413204 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.412961 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-trusted-ca\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.413204 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.413065 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-certificates\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.417667 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.417640 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-image-registry-private-configuration\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.417773 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.417684 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-installation-pull-secrets\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.429198 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.429173 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvczt\" (UniqueName: \"kubernetes.io/projected/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-kube-api-access-xvczt\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:24.432754 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.432734 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xkq9\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-kube-api-access-8xkq9\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.434550 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.434530 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-bound-sa-token\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.513061 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.513037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.513172 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.513100 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24fpj\" (UniqueName: \"kubernetes.io/projected/859f8212-1b13-42e6-b832-83bafe50547d-kube-api-access-24fpj\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.513221 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.513194 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:24.513256 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.513233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/859f8212-1b13-42e6-b832-83bafe50547d-config-volume\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.513291 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.513253 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls podName:859f8212-1b13-42e6-b832-83bafe50547d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:25.013233249 +0000 UTC m=+33.709856093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls") pod "dns-default-dktqj" (UID: "859f8212-1b13-42e6-b832-83bafe50547d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:24.513291 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.513283 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/859f8212-1b13-42e6-b832-83bafe50547d-tmp-dir\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.513525 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.513509 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/859f8212-1b13-42e6-b832-83bafe50547d-tmp-dir\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.513599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.513586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/859f8212-1b13-42e6-b832-83bafe50547d-config-volume\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.522276 ip-10-0-133-9 
kubenswrapper[2571]: I0424 22:30:24.522259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24fpj\" (UniqueName: \"kubernetes.io/projected/859f8212-1b13-42e6-b832-83bafe50547d-kube-api-access-24fpj\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:24.613555 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.613533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:24.613678 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.613661 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:24.613724 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.613715 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:56.613702842 +0000 UTC m=+65.310325670 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:24.714209 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.714149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:24.714310 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.714292 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:24.714361 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.714313 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:24.714361 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.714324 2571 projected.go:194] Error preparing data for projected volume kube-api-access-nhf9q for pod openshift-network-diagnostics/network-check-target-ngpww: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:24.714432 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.714364 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q podName:cba3f39b-cb19-416b-a21a-64491aff6ce9 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:56.714352193 +0000 UTC m=+65.410975021 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nhf9q" (UniqueName: "kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q") pod "network-check-target-ngpww" (UID: "cba3f39b-cb19-416b-a21a-64491aff6ce9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:24.915530 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.915504 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:24.916014 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:24.915619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:24.916014 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.915741 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:24.916014 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.915758 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c8cfb6695-qd2ch: secret "image-registry-tls" not found Apr 24 22:30:24.916014 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.915809 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls podName:6267a99a-1aea-462f-bab4-ce95abd8548d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:25.915792191 +0000 UTC m=+34.612415032 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls") pod "image-registry-c8cfb6695-qd2ch" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d") : secret "image-registry-tls" not found Apr 24 22:30:24.916276 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.916195 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:24.916276 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:24.916252 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert podName:dc34a3cc-7a87-42d1-a0c7-d317f40146bb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:25.916235636 +0000 UTC m=+34.612858464 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert") pod "ingress-canary-vb7wv" (UID: "dc34a3cc-7a87-42d1-a0c7-d317f40146bb") : secret "canary-serving-cert" not found Apr 24 22:30:25.016832 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.016808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:25.016961 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:25.016943 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:25.017008 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:25.016999 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls podName:859f8212-1b13-42e6-b832-83bafe50547d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:26.01698403 +0000 UTC m=+34.713606858 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls") pod "dns-default-dktqj" (UID: "859f8212-1b13-42e6-b832-83bafe50547d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:25.045826 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.045796 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerStarted","Data":"c1cf71491fe4ab4192f128a1704c3e38b4b8966ce1e36c1dbb138f3021113acf"} Apr 24 22:30:25.851770 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.851719 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2" Apr 24 22:30:25.851770 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.851735 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:25.851770 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.851735 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:25.856257 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.856223 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:25.856390 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.856284 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 22:30:25.856531 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.856284 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:30:25.856672 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.856284 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:30:25.856746 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.856720 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dndrr\"" Apr 24 22:30:25.856842 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.856284 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dgz7\"" Apr 24 22:30:25.922782 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.922759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:25.923134 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:25.922793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:25.923134 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:25.922891 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:25.923134 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:25.922906 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c8cfb6695-qd2ch: secret "image-registry-tls" not found Apr 24 22:30:25.923134 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:25.922944 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:25.923134 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:25.922957 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls podName:6267a99a-1aea-462f-bab4-ce95abd8548d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:27.922940725 +0000 UTC m=+36.619563553 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls") pod "image-registry-c8cfb6695-qd2ch" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d") : secret "image-registry-tls" not found Apr 24 22:30:25.923134 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:25.922979 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert podName:dc34a3cc-7a87-42d1-a0c7-d317f40146bb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:27.922968652 +0000 UTC m=+36.619591479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert") pod "ingress-canary-vb7wv" (UID: "dc34a3cc-7a87-42d1-a0c7-d317f40146bb") : secret "canary-serving-cert" not found Apr 24 22:30:26.023349 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:26.023326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:26.023463 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:26.023451 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:26.023511 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:26.023494 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls podName:859f8212-1b13-42e6-b832-83bafe50547d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:28.023484194 +0000 UTC m=+36.720107022 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls") pod "dns-default-dktqj" (UID: "859f8212-1b13-42e6-b832-83bafe50547d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:26.049403 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:26.049379 2571 generic.go:358] "Generic (PLEG): container finished" podID="31c0602b-6394-42d8-b7cc-1a807f7ea065" containerID="c1cf71491fe4ab4192f128a1704c3e38b4b8966ce1e36c1dbb138f3021113acf" exitCode=0 Apr 24 22:30:26.049491 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:26.049407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerDied","Data":"c1cf71491fe4ab4192f128a1704c3e38b4b8966ce1e36c1dbb138f3021113acf"} Apr 24 22:30:27.054096 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:27.054035 2571 generic.go:358] "Generic (PLEG): container finished" podID="31c0602b-6394-42d8-b7cc-1a807f7ea065" containerID="292671e7c43251c6fd180a59c9172c0c2f82c73e7d5f960cb7dc3de51640bd1e" exitCode=0 Apr 24 22:30:27.054452 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:27.054117 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerDied","Data":"292671e7c43251c6fd180a59c9172c0c2f82c73e7d5f960cb7dc3de51640bd1e"} Apr 24 22:30:27.936659 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:27.936498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:27.936807 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:27.936671 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:27.936807 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:27.936637 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:27.936807 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:27.936739 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c8cfb6695-qd2ch: secret "image-registry-tls" not found Apr 24 22:30:27.936807 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:27.936758 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:27.936807 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:27.936796 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert podName:dc34a3cc-7a87-42d1-a0c7-d317f40146bb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:31.936783814 +0000 UTC m=+40.633406643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert") pod "ingress-canary-vb7wv" (UID: "dc34a3cc-7a87-42d1-a0c7-d317f40146bb") : secret "canary-serving-cert" not found Apr 24 22:30:27.936807 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:27.936808 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls podName:6267a99a-1aea-462f-bab4-ce95abd8548d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:31.93680218 +0000 UTC m=+40.633425008 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls") pod "image-registry-c8cfb6695-qd2ch" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d") : secret "image-registry-tls" not found Apr 24 22:30:28.037773 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:28.037749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:28.037911 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:28.037836 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:28.037911 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:28.037881 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls podName:859f8212-1b13-42e6-b832-83bafe50547d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:32.0378682 +0000 UTC m=+40.734491028 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls") pod "dns-default-dktqj" (UID: "859f8212-1b13-42e6-b832-83bafe50547d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:28.061519 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:28.061494 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" event={"ID":"31c0602b-6394-42d8-b7cc-1a807f7ea065","Type":"ContainerStarted","Data":"f8b28907414df971fd57e3cd17dce0947580d478800c665e7a72bd4a527b89a4"} Apr 24 22:30:28.087120 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:28.087078 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n6k9b" podStartSLOduration=4.346005451 podStartE2EDuration="36.087066679s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:29:53.094080608 +0000 UTC m=+1.790703437" lastFinishedPulling="2026-04-24 22:30:24.835141837 +0000 UTC m=+33.531764665" observedRunningTime="2026-04-24 22:30:28.085949072 +0000 UTC m=+36.782571922" watchObservedRunningTime="2026-04-24 22:30:28.087066679 +0000 UTC m=+36.783689529" Apr 24 22:30:29.347361 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:29.347321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret\") pod \"global-pull-secret-syncer-8cmm2\" (UID: \"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2" Apr 24 22:30:29.350431 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:29.350399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46cdb586-7bdd-41d4-9d74-7e99334be435-original-pull-secret\") pod \"global-pull-secret-syncer-8cmm2\" (UID: 
\"46cdb586-7bdd-41d4-9d74-7e99334be435\") " pod="kube-system/global-pull-secret-syncer-8cmm2" Apr 24 22:30:29.463443 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:29.463411 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cmm2" Apr 24 22:30:29.607941 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:29.607883 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8cmm2"] Apr 24 22:30:29.611470 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:30:29.611445 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46cdb586_7bdd_41d4_9d74_7e99334be435.slice/crio-692341286ea4103658cfae7967b25a1986f1bb105e0689bdcacf1a069490b40d WatchSource:0}: Error finding container 692341286ea4103658cfae7967b25a1986f1bb105e0689bdcacf1a069490b40d: Status 404 returned error can't find the container with id 692341286ea4103658cfae7967b25a1986f1bb105e0689bdcacf1a069490b40d Apr 24 22:30:30.066264 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:30.066227 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8cmm2" event={"ID":"46cdb586-7bdd-41d4-9d74-7e99334be435","Type":"ContainerStarted","Data":"692341286ea4103658cfae7967b25a1986f1bb105e0689bdcacf1a069490b40d"} Apr 24 22:30:31.283627 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.283582 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k"] Apr 24 22:30:31.305842 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.305812 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x"] Apr 24 22:30:31.306003 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.305976 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.308049 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.308022 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 22:30:31.309622 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.309486 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 22:30:31.309622 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.309549 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 22:30:31.310819 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.310453 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 22:30:31.317995 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.317974 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k"] Apr 24 22:30:31.318108 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.318006 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr"] Apr 24 22:30:31.318752 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.318341 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" Apr 24 22:30:31.320234 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.320150 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-mptjv\"" Apr 24 22:30:31.322070 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.322052 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 22:30:31.329584 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.329551 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x"] Apr 24 22:30:31.329675 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.329650 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.334134 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.334085 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 22:30:31.334134 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.334124 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 22:30:31.334304 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.334227 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 22:30:31.334304 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.334266 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 22:30:31.336925 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.336906 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr"] Apr 24 22:30:31.464395 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5rf\" (UniqueName: \"kubernetes.io/projected/5fbe09f4-5ecf-4342-880c-3bde8f156081-kube-api-access-hr5rf\") pod \"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.464607 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464419 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-ca\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.464607 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.464607 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4tqh\" (UniqueName: 
\"kubernetes.io/projected/f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc-kube-api-access-r4tqh\") pod \"managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x\" (UID: \"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" Apr 24 22:30:31.464607 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.464607 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464583 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x\" (UID: \"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" Apr 24 22:30:31.464607 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464606 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5fbe09f4-5ecf-4342-880c-3bde8f156081-klusterlet-config\") pod \"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.464869 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.464869 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464684 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-hub\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.464869 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464723 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj696\" (UniqueName: \"kubernetes.io/projected/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-kube-api-access-wj696\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.464869 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.464737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5fbe09f4-5ecf-4342-880c-3bde8f156081-tmp\") pod \"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.565923 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.565852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5rf\" (UniqueName: \"kubernetes.io/projected/5fbe09f4-5ecf-4342-880c-3bde8f156081-kube-api-access-hr5rf\") pod 
\"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.565923 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.565909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-ca\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.565934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.565959 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4tqh\" (UniqueName: \"kubernetes.io/projected/f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc-kube-api-access-r4tqh\") pod \"managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x\" (UID: \"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.565989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.566014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x\" (UID: \"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.566038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5fbe09f4-5ecf-4342-880c-3bde8f156081-klusterlet-config\") pod \"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.566076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.566099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-hub\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.566174 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj696\" (UniqueName: \"kubernetes.io/projected/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-kube-api-access-wj696\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.566261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.566200 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5fbe09f4-5ecf-4342-880c-3bde8f156081-tmp\") pod \"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.566645 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.566619 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5fbe09f4-5ecf-4342-880c-3bde8f156081-tmp\") pod \"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.566922 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.566900 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.569461 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.569406 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-ca\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: 
\"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.569461 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.569418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.569622 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.569513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.569754 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.569735 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/5fbe09f4-5ecf-4342-880c-3bde8f156081-klusterlet-config\") pod \"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.570395 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.570375 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-hub\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.570458 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:30:31.570423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x\" (UID: \"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" Apr 24 22:30:31.575682 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.575652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4tqh\" (UniqueName: \"kubernetes.io/projected/f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc-kube-api-access-r4tqh\") pod \"managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x\" (UID: \"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" Apr 24 22:30:31.576205 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.576140 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5rf\" (UniqueName: \"kubernetes.io/projected/5fbe09f4-5ecf-4342-880c-3bde8f156081-kube-api-access-hr5rf\") pod \"klusterlet-addon-workmgr-5877c69d5c-drh4k\" (UID: \"5fbe09f4-5ecf-4342-880c-3bde8f156081\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.577934 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.577912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj696\" (UniqueName: \"kubernetes.io/projected/f4fef7d4-2ad5-4750-b900-bb8ea8141fc5-kube-api-access-wj696\") pod \"cluster-proxy-proxy-agent-84c59c454f-6zwpr\" (UID: \"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.622875 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.622838 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:31.638493 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.638470 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" Apr 24 22:30:31.647759 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.647734 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:30:31.792635 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.792579 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k"] Apr 24 22:30:31.795044 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.795020 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x"] Apr 24 22:30:31.797512 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:30:31.797485 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fbe09f4_5ecf_4342_880c_3bde8f156081.slice/crio-05a15901b264bf027b1aa41f6b973e5af1b74028ae6f31a533f50e0c19edefcb WatchSource:0}: Error finding container 05a15901b264bf027b1aa41f6b973e5af1b74028ae6f31a533f50e0c19edefcb: Status 404 returned error can't find the container with id 05a15901b264bf027b1aa41f6b973e5af1b74028ae6f31a533f50e0c19edefcb Apr 24 22:30:31.799521 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:30:31.799497 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf791f0e2_deb9_4c1d_a460_b5cb6fb9d5fc.slice/crio-faeeb84a20aa204b163d2eb092366c8f182eb8d250f4af05518c77ef1ef4ae1f WatchSource:0}: Error finding container 
faeeb84a20aa204b163d2eb092366c8f182eb8d250f4af05518c77ef1ef4ae1f: Status 404 returned error can't find the container with id faeeb84a20aa204b163d2eb092366c8f182eb8d250f4af05518c77ef1ef4ae1f Apr 24 22:30:31.823508 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.823436 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr"] Apr 24 22:30:31.826898 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:30:31.826874 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4fef7d4_2ad5_4750_b900_bb8ea8141fc5.slice/crio-75ad10e657a7e734534d06d1c8bd4d53ab6b93a10c5788a67961a047fa92a31a WatchSource:0}: Error finding container 75ad10e657a7e734534d06d1c8bd4d53ab6b93a10c5788a67961a047fa92a31a: Status 404 returned error can't find the container with id 75ad10e657a7e734534d06d1c8bd4d53ab6b93a10c5788a67961a047fa92a31a Apr 24 22:30:31.968863 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.968831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:31.969018 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:31.968989 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:31.969086 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:31.969051 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert podName:dc34a3cc-7a87-42d1-a0c7-d317f40146bb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:39.96903608 +0000 UTC m=+48.665658907 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert") pod "ingress-canary-vb7wv" (UID: "dc34a3cc-7a87-42d1-a0c7-d317f40146bb") : secret "canary-serving-cert" not found Apr 24 22:30:31.969150 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:31.969098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:31.969238 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:31.969223 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:31.969289 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:31.969240 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c8cfb6695-qd2ch: secret "image-registry-tls" not found Apr 24 22:30:31.969339 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:31.969288 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls podName:6267a99a-1aea-462f-bab4-ce95abd8548d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:39.969275315 +0000 UTC m=+48.665898151 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls") pod "image-registry-c8cfb6695-qd2ch" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d") : secret "image-registry-tls" not found Apr 24 22:30:32.069933 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:32.069893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:32.070084 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:32.070066 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:32.070173 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:32.070148 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls podName:859f8212-1b13-42e6-b832-83bafe50547d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:40.070126607 +0000 UTC m=+48.766749437 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls") pod "dns-default-dktqj" (UID: "859f8212-1b13-42e6-b832-83bafe50547d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:32.072935 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:32.072907 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" event={"ID":"5fbe09f4-5ecf-4342-880c-3bde8f156081","Type":"ContainerStarted","Data":"05a15901b264bf027b1aa41f6b973e5af1b74028ae6f31a533f50e0c19edefcb"} Apr 24 22:30:32.074055 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:32.073996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" event={"ID":"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5","Type":"ContainerStarted","Data":"75ad10e657a7e734534d06d1c8bd4d53ab6b93a10c5788a67961a047fa92a31a"} Apr 24 22:30:32.075063 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:32.075040 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" event={"ID":"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc","Type":"ContainerStarted","Data":"faeeb84a20aa204b163d2eb092366c8f182eb8d250f4af05518c77ef1ef4ae1f"} Apr 24 22:30:34.084671 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:34.084581 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8cmm2" event={"ID":"46cdb586-7bdd-41d4-9d74-7e99334be435","Type":"ContainerStarted","Data":"007df69e29c0d3fe70cd1b7ea1d4daaecbda561ef9e3f7d080ef5341abcfdcfc"} Apr 24 22:30:34.100107 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:34.100052 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8cmm2" podStartSLOduration=17.016356869 podStartE2EDuration="21.100033961s" 
podCreationTimestamp="2026-04-24 22:30:13 +0000 UTC" firstStartedPulling="2026-04-24 22:30:29.613192552 +0000 UTC m=+38.309815380" lastFinishedPulling="2026-04-24 22:30:33.696869638 +0000 UTC m=+42.393492472" observedRunningTime="2026-04-24 22:30:34.098741487 +0000 UTC m=+42.795364338" watchObservedRunningTime="2026-04-24 22:30:34.100033961 +0000 UTC m=+42.796656812" Apr 24 22:30:40.030569 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.030524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:40.031092 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.030590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:40.031092 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:40.030681 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:40.031092 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:40.030705 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c8cfb6695-qd2ch: secret "image-registry-tls" not found Apr 24 22:30:40.031092 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:40.030724 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:40.031092 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:40.030771 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert podName:dc34a3cc-7a87-42d1-a0c7-d317f40146bb nodeName:}" failed. No retries permitted until 2026-04-24 22:30:56.030753023 +0000 UTC m=+64.727375862 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert") pod "ingress-canary-vb7wv" (UID: "dc34a3cc-7a87-42d1-a0c7-d317f40146bb") : secret "canary-serving-cert" not found Apr 24 22:30:40.031092 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:40.030792 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls podName:6267a99a-1aea-462f-bab4-ce95abd8548d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:56.030782902 +0000 UTC m=+64.727405733 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls") pod "image-registry-c8cfb6695-qd2ch" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d") : secret "image-registry-tls" not found Apr 24 22:30:40.096852 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.096817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" event={"ID":"5fbe09f4-5ecf-4342-880c-3bde8f156081","Type":"ContainerStarted","Data":"6f59ca9267e6d2ae7a6f548be9dd6846cced15480b0060a8a17ab46c93672cc8"} Apr 24 22:30:40.097037 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.097017 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:40.098361 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.098332 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" 
event={"ID":"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5","Type":"ContainerStarted","Data":"28b17775bf319b03f80c1991b19a4f5b60247d6ee12defa96c6da74357082802"} Apr 24 22:30:40.099013 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.098991 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:30:40.099776 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.099752 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" event={"ID":"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc","Type":"ContainerStarted","Data":"815fb05b38b5a0619a71064f0acd16bcf2b8bc6868acf5507e09bdee5a9b3a98"} Apr 24 22:30:40.114451 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.114412 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" podStartSLOduration=1.881556035 podStartE2EDuration="9.11439759s" podCreationTimestamp="2026-04-24 22:30:31 +0000 UTC" firstStartedPulling="2026-04-24 22:30:31.799835149 +0000 UTC m=+40.496457977" lastFinishedPulling="2026-04-24 22:30:39.032676701 +0000 UTC m=+47.729299532" observedRunningTime="2026-04-24 22:30:40.113006775 +0000 UTC m=+48.809629844" watchObservedRunningTime="2026-04-24 22:30:40.11439759 +0000 UTC m=+48.811020441" Apr 24 22:30:40.128754 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.128712 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" podStartSLOduration=1.891358715 podStartE2EDuration="9.128703188s" podCreationTimestamp="2026-04-24 22:30:31 +0000 UTC" firstStartedPulling="2026-04-24 22:30:31.801932891 +0000 UTC m=+40.498555730" lastFinishedPulling="2026-04-24 22:30:39.039277375 +0000 UTC m=+47.735900203" observedRunningTime="2026-04-24 
22:30:40.127907109 +0000 UTC m=+48.824529959" watchObservedRunningTime="2026-04-24 22:30:40.128703188 +0000 UTC m=+48.825326037" Apr 24 22:30:40.131040 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:40.131021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:40.131204 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:40.131185 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:40.131288 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:40.131245 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls podName:859f8212-1b13-42e6-b832-83bafe50547d nodeName:}" failed. No retries permitted until 2026-04-24 22:30:56.131233491 +0000 UTC m=+64.827856319 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls") pod "dns-default-dktqj" (UID: "859f8212-1b13-42e6-b832-83bafe50547d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:42.106027 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:42.105990 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" event={"ID":"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5","Type":"ContainerStarted","Data":"0fe1839422c23cf132c9fd1a0160199fdf51b5262e5dc079c9a79179777f2dd2"} Apr 24 22:30:42.106027 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:42.106030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" event={"ID":"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5","Type":"ContainerStarted","Data":"dd7d30e349bc65af8c5100971ca56dd64e249558e5cd05ec3a757e1b6ec1bef0"} Apr 24 22:30:42.132530 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:42.132478 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" podStartSLOduration=1.62980901 podStartE2EDuration="11.132462298s" podCreationTimestamp="2026-04-24 22:30:31 +0000 UTC" firstStartedPulling="2026-04-24 22:30:31.828358176 +0000 UTC m=+40.524981008" lastFinishedPulling="2026-04-24 22:30:41.331011467 +0000 UTC m=+50.027634296" observedRunningTime="2026-04-24 22:30:42.132416551 +0000 UTC m=+50.829039403" watchObservedRunningTime="2026-04-24 22:30:42.132462298 +0000 UTC m=+50.829085148" Apr 24 22:30:50.043882 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:50.043851 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46t57" Apr 24 22:30:56.041115 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.041076 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:30:56.041115 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.041119 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv" Apr 24 22:30:56.041588 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.041239 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:56.041588 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.041247 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:56.041588 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.041271 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c8cfb6695-qd2ch: secret "image-registry-tls" not found Apr 24 22:30:56.041588 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.041298 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert podName:dc34a3cc-7a87-42d1-a0c7-d317f40146bb nodeName:}" failed. No retries permitted until 2026-04-24 22:31:28.041284575 +0000 UTC m=+96.737907402 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert") pod "ingress-canary-vb7wv" (UID: "dc34a3cc-7a87-42d1-a0c7-d317f40146bb") : secret "canary-serving-cert" not found Apr 24 22:30:56.041588 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.041335 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls podName:6267a99a-1aea-462f-bab4-ce95abd8548d nodeName:}" failed. No retries permitted until 2026-04-24 22:31:28.04131822 +0000 UTC m=+96.737941051 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls") pod "image-registry-c8cfb6695-qd2ch" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d") : secret "image-registry-tls" not found Apr 24 22:30:56.141804 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.141772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj" Apr 24 22:30:56.141961 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.141905 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:56.141961 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.141955 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls podName:859f8212-1b13-42e6-b832-83bafe50547d nodeName:}" failed. No retries permitted until 2026-04-24 22:31:28.141942424 +0000 UTC m=+96.838565253 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls") pod "dns-default-dktqj" (UID: "859f8212-1b13-42e6-b832-83bafe50547d") : secret "dns-default-metrics-tls" not found Apr 24 22:30:56.643839 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.643804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:30:56.646042 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.646020 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:56.654621 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.654604 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 22:30:56.654717 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:30:56.654667 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:00.654648125 +0000 UTC m=+129.351270954 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : secret "metrics-daemon-secret" not found Apr 24 22:30:56.744137 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.744104 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:56.746214 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.746196 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:30:56.756468 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.756453 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:30:56.767280 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.767261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhf9q\" (UniqueName: \"kubernetes.io/projected/cba3f39b-cb19-416b-a21a-64491aff6ce9-kube-api-access-nhf9q\") pod \"network-check-target-ngpww\" (UID: \"cba3f39b-cb19-416b-a21a-64491aff6ce9\") " pod="openshift-network-diagnostics/network-check-target-ngpww" Apr 24 22:30:56.771167 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.771130 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dndrr\"" Apr 24 22:30:56.779835 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.779818 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:30:56.888622 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:56.888596 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ngpww"]
Apr 24 22:30:56.891065 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:30:56.891033 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba3f39b_cb19_416b_a21a_64491aff6ce9.slice/crio-232c566092eaffbe7c66bd36492dd761e3436dbf701ebb39d11b3a09a16361f8 WatchSource:0}: Error finding container 232c566092eaffbe7c66bd36492dd761e3436dbf701ebb39d11b3a09a16361f8: Status 404 returned error can't find the container with id 232c566092eaffbe7c66bd36492dd761e3436dbf701ebb39d11b3a09a16361f8
Apr 24 22:30:57.142379 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:30:57.142340 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ngpww" event={"ID":"cba3f39b-cb19-416b-a21a-64491aff6ce9","Type":"ContainerStarted","Data":"232c566092eaffbe7c66bd36492dd761e3436dbf701ebb39d11b3a09a16361f8"}
Apr 24 22:31:00.151669 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:00.151627 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ngpww" event={"ID":"cba3f39b-cb19-416b-a21a-64491aff6ce9","Type":"ContainerStarted","Data":"a4c385643ca4b6d30f505a864f06f8909cee2f373af6a92228949791ae14ab20"}
Apr 24 22:31:00.152062 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:00.151774 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:31:00.168818 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:00.168777 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ngpww" podStartSLOduration=65.406264023 podStartE2EDuration="1m8.16876566s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:30:56.892913897 +0000 UTC m=+65.589536725" lastFinishedPulling="2026-04-24 22:30:59.655415534 +0000 UTC m=+68.352038362" observedRunningTime="2026-04-24 22:31:00.168403967 +0000 UTC m=+68.865026818" watchObservedRunningTime="2026-04-24 22:31:00.16876566 +0000 UTC m=+68.865388510"
Apr 24 22:31:28.049737 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:28.049700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch"
Apr 24 22:31:28.050146 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:28.049744 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv"
Apr 24 22:31:28.050146 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:31:28.049835 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:31:28.050146 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:31:28.049843 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:31:28.050146 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:31:28.049861 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c8cfb6695-qd2ch: secret "image-registry-tls" not found
Apr 24 22:31:28.050146 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:31:28.049884 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert podName:dc34a3cc-7a87-42d1-a0c7-d317f40146bb nodeName:}" failed. No retries permitted until 2026-04-24 22:32:32.049870538 +0000 UTC m=+160.746493379 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert") pod "ingress-canary-vb7wv" (UID: "dc34a3cc-7a87-42d1-a0c7-d317f40146bb") : secret "canary-serving-cert" not found
Apr 24 22:31:28.050146 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:31:28.049914 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls podName:6267a99a-1aea-462f-bab4-ce95abd8548d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:32.049901357 +0000 UTC m=+160.746524185 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls") pod "image-registry-c8cfb6695-qd2ch" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d") : secret "image-registry-tls" not found
Apr 24 22:31:28.150344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:28.150320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj"
Apr 24 22:31:28.150466 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:31:28.150451 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:31:28.150515 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:31:28.150505 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls podName:859f8212-1b13-42e6-b832-83bafe50547d nodeName:}" failed. No retries permitted until 2026-04-24 22:32:32.150489605 +0000 UTC m=+160.847112439 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls") pod "dns-default-dktqj" (UID: "859f8212-1b13-42e6-b832-83bafe50547d") : secret "dns-default-metrics-tls" not found
Apr 24 22:31:31.158272 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:31.158241 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ngpww"
Apr 24 22:31:57.083213 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:57.083183 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mjsxr_664d5264-1f8a-4986-9272-2e8a718a8923/dns-node-resolver/0.log"
Apr 24 22:31:57.670122 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:31:57.670097 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bgw9g_e118d567-f878-439c-b74f-3e060f10ac46/node-ca/0.log"
Apr 24 22:32:00.674275 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:00.674232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln"
Apr 24 22:32:00.674717 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:32:00.674389 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 22:32:00.674717 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:32:00.674474 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs podName:12c9d9cf-479c-46fd-9333-94213f4ff2f0 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:02.674455068 +0000 UTC m=+251.371077899 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs") pod "network-metrics-daemon-tphln" (UID: "12c9d9cf-479c-46fd-9333-94213f4ff2f0") : secret "metrics-daemon-secret" not found
Apr 24 22:32:22.253898 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.253864 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dldk9"]
Apr 24 22:32:22.256366 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.256351 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.263985 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.263956 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 22:32:22.264080 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.263984 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 22:32:22.264541 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.264524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 22:32:22.264627 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.264613 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 22:32:22.264697 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.264663 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dwrxh\""
Apr 24 22:32:22.269170 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.269135 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dldk9"]
Apr 24 22:32:22.323124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.323095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85bdc7f6-1a52-4eb1-92b9-d63889497856-crio-socket\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.323124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.323127 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bqf5\" (UniqueName: \"kubernetes.io/projected/85bdc7f6-1a52-4eb1-92b9-d63889497856-kube-api-access-8bqf5\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.323284 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.323168 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85bdc7f6-1a52-4eb1-92b9-d63889497856-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.323284 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.323219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85bdc7f6-1a52-4eb1-92b9-d63889497856-data-volume\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.323284 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.323263 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85bdc7f6-1a52-4eb1-92b9-d63889497856-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.424526 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.424503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85bdc7f6-1a52-4eb1-92b9-d63889497856-crio-socket\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.424656 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.424540 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bqf5\" (UniqueName: \"kubernetes.io/projected/85bdc7f6-1a52-4eb1-92b9-d63889497856-kube-api-access-8bqf5\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.424656 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.424575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85bdc7f6-1a52-4eb1-92b9-d63889497856-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.424656 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.424606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85bdc7f6-1a52-4eb1-92b9-d63889497856-data-volume\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.424656 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.424620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85bdc7f6-1a52-4eb1-92b9-d63889497856-crio-socket\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.424866 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.424688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85bdc7f6-1a52-4eb1-92b9-d63889497856-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.424967 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.424950 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85bdc7f6-1a52-4eb1-92b9-d63889497856-data-volume\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.425194 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.425178 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85bdc7f6-1a52-4eb1-92b9-d63889497856-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.426873 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.426855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85bdc7f6-1a52-4eb1-92b9-d63889497856-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.434286 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.434268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bqf5\" (UniqueName: \"kubernetes.io/projected/85bdc7f6-1a52-4eb1-92b9-d63889497856-kube-api-access-8bqf5\") pod \"insights-runtime-extractor-dldk9\" (UID: \"85bdc7f6-1a52-4eb1-92b9-d63889497856\") " pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.564410 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.564357 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dldk9"
Apr 24 22:32:22.686090 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:22.686059 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dldk9"]
Apr 24 22:32:22.691114 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:32:22.691080 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bdc7f6_1a52_4eb1_92b9_d63889497856.slice/crio-e76dc5b5594c6ff9a5e3be4b73f3a59ac2f451e2e5e9e2218ac733069cc5e790 WatchSource:0}: Error finding container e76dc5b5594c6ff9a5e3be4b73f3a59ac2f451e2e5e9e2218ac733069cc5e790: Status 404 returned error can't find the container with id e76dc5b5594c6ff9a5e3be4b73f3a59ac2f451e2e5e9e2218ac733069cc5e790
Apr 24 22:32:23.345692 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:23.345655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dldk9" event={"ID":"85bdc7f6-1a52-4eb1-92b9-d63889497856","Type":"ContainerStarted","Data":"4543e899056b6cda44a45d4bc40d2b33f2af2159e333ce16d97ddedf08aba7d5"}
Apr 24 22:32:23.345692 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:23.345699 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dldk9" event={"ID":"85bdc7f6-1a52-4eb1-92b9-d63889497856","Type":"ContainerStarted","Data":"e76dc5b5594c6ff9a5e3be4b73f3a59ac2f451e2e5e9e2218ac733069cc5e790"}
Apr 24 22:32:24.350168 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:24.350126 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dldk9" event={"ID":"85bdc7f6-1a52-4eb1-92b9-d63889497856","Type":"ContainerStarted","Data":"d0edcb99368da1e6f089daf46ef0b592febe99f038f433e63c7418edc53d46c5"}
Apr 24 22:32:25.354398 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:25.354361 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dldk9" event={"ID":"85bdc7f6-1a52-4eb1-92b9-d63889497856","Type":"ContainerStarted","Data":"1b4e32eb0060223eca916cf029b58aabd407be6619fd42a0ecf3824bd722b812"}
Apr 24 22:32:25.372014 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:25.371911 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dldk9" podStartSLOduration=1.267867178 podStartE2EDuration="3.371895616s" podCreationTimestamp="2026-04-24 22:32:22 +0000 UTC" firstStartedPulling="2026-04-24 22:32:22.740452382 +0000 UTC m=+151.437075210" lastFinishedPulling="2026-04-24 22:32:24.844480814 +0000 UTC m=+153.541103648" observedRunningTime="2026-04-24 22:32:25.371821867 +0000 UTC m=+154.068444719" watchObservedRunningTime="2026-04-24 22:32:25.371895616 +0000 UTC m=+154.068518467"
Apr 24 22:32:27.189351 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:32:27.189307 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" podUID="6267a99a-1aea-462f-bab4-ce95abd8548d"
Apr 24 22:32:27.204524 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:32:27.204498 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vb7wv" podUID="dc34a3cc-7a87-42d1-a0c7-d317f40146bb"
Apr 24 22:32:27.309132 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:32:27.309091 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dktqj" podUID="859f8212-1b13-42e6-b832-83bafe50547d"
Apr 24 22:32:27.360937 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:27.360908 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vb7wv"
Apr 24 22:32:27.361049 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:27.360919 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dktqj"
Apr 24 22:32:28.873972 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:32:28.873912 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tphln" podUID="12c9d9cf-479c-46fd-9333-94213f4ff2f0"
Apr 24 22:32:32.094994 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.094952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch"
Apr 24 22:32:32.094994 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.094991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv"
Apr 24 22:32:32.097549 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.097525 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc34a3cc-7a87-42d1-a0c7-d317f40146bb-cert\") pod \"ingress-canary-vb7wv\" (UID: \"dc34a3cc-7a87-42d1-a0c7-d317f40146bb\") " pod="openshift-ingress-canary/ingress-canary-vb7wv"
Apr 24 22:32:32.097620 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.097525 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"image-registry-c8cfb6695-qd2ch\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch"
Apr 24 22:32:32.163488 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.163463 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdthp\""
Apr 24 22:32:32.171962 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.171947 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vb7wv"
Apr 24 22:32:32.196016 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.195990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj"
Apr 24 22:32:32.198038 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.198018 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/859f8212-1b13-42e6-b832-83bafe50547d-metrics-tls\") pod \"dns-default-dktqj\" (UID: \"859f8212-1b13-42e6-b832-83bafe50547d\") " pod="openshift-dns/dns-default-dktqj"
Apr 24 22:32:32.280989 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.280962 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vb7wv"]
Apr 24 22:32:32.284308 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:32:32.284280 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc34a3cc_7a87_42d1_a0c7_d317f40146bb.slice/crio-0f4e28bba2482f47f9bd301b4e964b6bde8564146ed5356b1648c73c4b3f315a WatchSource:0}: Error finding container 0f4e28bba2482f47f9bd301b4e964b6bde8564146ed5356b1648c73c4b3f315a: Status 404 returned error can't find the container with id 0f4e28bba2482f47f9bd301b4e964b6bde8564146ed5356b1648c73c4b3f315a
Apr 24 22:32:32.373209 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.373178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vb7wv" event={"ID":"dc34a3cc-7a87-42d1-a0c7-d317f40146bb","Type":"ContainerStarted","Data":"0f4e28bba2482f47f9bd301b4e964b6bde8564146ed5356b1648c73c4b3f315a"}
Apr 24 22:32:32.463348 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.463326 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svq4d\""
Apr 24 22:32:32.471895 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.471865 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dktqj"
Apr 24 22:32:32.581348 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:32.581321 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dktqj"]
Apr 24 22:32:32.584552 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:32:32.584519 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod859f8212_1b13_42e6_b832_83bafe50547d.slice/crio-70ababa60f5379882ce2f5b21f236228027831593b28678bb7d4daa674ce9cfd WatchSource:0}: Error finding container 70ababa60f5379882ce2f5b21f236228027831593b28678bb7d4daa674ce9cfd: Status 404 returned error can't find the container with id 70ababa60f5379882ce2f5b21f236228027831593b28678bb7d4daa674ce9cfd
Apr 24 22:32:33.376785 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:33.376744 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dktqj" event={"ID":"859f8212-1b13-42e6-b832-83bafe50547d","Type":"ContainerStarted","Data":"70ababa60f5379882ce2f5b21f236228027831593b28678bb7d4daa674ce9cfd"}
Apr 24 22:32:34.359325 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.359299 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gnqvl"]
Apr 24 22:32:34.362535 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.362513 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.365069 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.365005 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 22:32:34.365435 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.365254 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 22:32:34.365435 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.365314 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 22:32:34.365435 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.365366 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8rvqf\""
Apr 24 22:32:34.366857 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.365768 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 22:32:34.366857 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.365807 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 22:32:34.366857 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.365832 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 22:32:34.515521 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515486 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzsk2\" (UniqueName: \"kubernetes.io/projected/c08a0646-f245-4a51-ac74-07a12a5ffe04-kube-api-access-dzsk2\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.515913 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-accelerators-collector-config\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.515913 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-root\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.515913 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515642 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-tls\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.515913 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-wtmp\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.515913 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.515913 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515727 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-sys\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.515913 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515771 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-textfile\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.515913 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.515797 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c08a0646-f245-4a51-ac74-07a12a5ffe04-metrics-client-ca\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616293 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzsk2\" (UniqueName: \"kubernetes.io/projected/c08a0646-f245-4a51-ac74-07a12a5ffe04-kube-api-access-dzsk2\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616416 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616328 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-accelerators-collector-config\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616416 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-root\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616416 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-tls\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616416 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-wtmp\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-sys\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616454 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-root\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616489 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-textfile\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616528 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c08a0646-f245-4a51-ac74-07a12a5ffe04-metrics-client-ca\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616599 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:32:34.616543 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 22:32:34.616599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-sys\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616590 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-wtmp\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.616849 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:32:34.616597 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-tls podName:c08a0646-f245-4a51-ac74-07a12a5ffe04 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:35.116578646 +0000 UTC m=+163.813201490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-tls") pod "node-exporter-gnqvl" (UID: "c08a0646-f245-4a51-ac74-07a12a5ffe04") : secret "node-exporter-tls" not found
Apr 24 22:32:34.616891 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.616866 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-textfile\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.617045 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.617029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c08a0646-f245-4a51-ac74-07a12a5ffe04-metrics-client-ca\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.617111 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.617095 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-accelerators-collector-config\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.618645 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:34.618625 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl"
Apr 24 22:32:34.624753 ip-10-0-133-9
kubenswrapper[2571]: I0424 22:32:34.624731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzsk2\" (UniqueName: \"kubernetes.io/projected/c08a0646-f245-4a51-ac74-07a12a5ffe04-kube-api-access-dzsk2\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl" Apr 24 22:32:35.120599 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.120568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-tls\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl" Apr 24 22:32:35.122827 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.122807 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c08a0646-f245-4a51-ac74-07a12a5ffe04-node-exporter-tls\") pod \"node-exporter-gnqvl\" (UID: \"c08a0646-f245-4a51-ac74-07a12a5ffe04\") " pod="openshift-monitoring/node-exporter-gnqvl" Apr 24 22:32:35.275846 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.275820 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-gnqvl" Apr 24 22:32:35.284442 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:32:35.284411 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc08a0646_f245_4a51_ac74_07a12a5ffe04.slice/crio-4eed1c34b6f2fac267005d5615ababb732928269e827618f604d994d3b9307c1 WatchSource:0}: Error finding container 4eed1c34b6f2fac267005d5615ababb732928269e827618f604d994d3b9307c1: Status 404 returned error can't find the container with id 4eed1c34b6f2fac267005d5615ababb732928269e827618f604d994d3b9307c1 Apr 24 22:32:35.390571 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.390495 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vb7wv" event={"ID":"dc34a3cc-7a87-42d1-a0c7-d317f40146bb","Type":"ContainerStarted","Data":"b5ba74b5c492fc56130c504cef87fe38aa4d821ef70a4c0dff59abf5202f06f7"} Apr 24 22:32:35.391970 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.391941 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dktqj" event={"ID":"859f8212-1b13-42e6-b832-83bafe50547d","Type":"ContainerStarted","Data":"266436af120922a8f7104160622a93e9b8196bbaba57ba439a94989271992959"} Apr 24 22:32:35.392143 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.391976 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dktqj" event={"ID":"859f8212-1b13-42e6-b832-83bafe50547d","Type":"ContainerStarted","Data":"6a96d47264ad604c2759435cd9e899237ba248f884a201fbe87924c761f7edbc"} Apr 24 22:32:35.392143 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.392080 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dktqj" Apr 24 22:32:35.392810 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.392789 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gnqvl" 
event={"ID":"c08a0646-f245-4a51-ac74-07a12a5ffe04","Type":"ContainerStarted","Data":"4eed1c34b6f2fac267005d5615ababb732928269e827618f604d994d3b9307c1"} Apr 24 22:32:35.408448 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:35.408410 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vb7wv" podStartSLOduration=129.374024877 podStartE2EDuration="2m11.408399884s" podCreationTimestamp="2026-04-24 22:30:24 +0000 UTC" firstStartedPulling="2026-04-24 22:32:32.285996578 +0000 UTC m=+160.982619406" lastFinishedPulling="2026-04-24 22:32:34.320371579 +0000 UTC m=+163.016994413" observedRunningTime="2026-04-24 22:32:35.408076374 +0000 UTC m=+164.104699224" watchObservedRunningTime="2026-04-24 22:32:35.408399884 +0000 UTC m=+164.105022734" Apr 24 22:32:36.396971 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:36.396939 2571 generic.go:358] "Generic (PLEG): container finished" podID="c08a0646-f245-4a51-ac74-07a12a5ffe04" containerID="47cfe178483b96d977db8b8c8ca9090b83c89b78fe6310a7c50209ef7d19a659" exitCode=0 Apr 24 22:32:36.397444 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:36.397034 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gnqvl" event={"ID":"c08a0646-f245-4a51-ac74-07a12a5ffe04","Type":"ContainerDied","Data":"47cfe178483b96d977db8b8c8ca9090b83c89b78fe6310a7c50209ef7d19a659"} Apr 24 22:32:36.444771 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:36.444728 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dktqj" podStartSLOduration=130.713852248 podStartE2EDuration="2m12.444709717s" podCreationTimestamp="2026-04-24 22:30:24 +0000 UTC" firstStartedPulling="2026-04-24 22:32:32.586728208 +0000 UTC m=+161.283351035" lastFinishedPulling="2026-04-24 22:32:34.317585659 +0000 UTC m=+163.014208504" observedRunningTime="2026-04-24 22:32:35.429772384 +0000 UTC m=+164.126395234" watchObservedRunningTime="2026-04-24 
22:32:36.444709717 +0000 UTC m=+165.141332565" Apr 24 22:32:37.401338 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:37.401300 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gnqvl" event={"ID":"c08a0646-f245-4a51-ac74-07a12a5ffe04","Type":"ContainerStarted","Data":"d1016fbdbafe0a949e0685757ceebe270e1719ac84d3a6376eda43e47a1fa7fa"} Apr 24 22:32:37.401338 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:37.401336 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gnqvl" event={"ID":"c08a0646-f245-4a51-ac74-07a12a5ffe04","Type":"ContainerStarted","Data":"604dbc42d1aa3045c5f3383666f15b2af339455e05e12c7a504d81118be822d0"} Apr 24 22:32:37.428354 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:37.428300 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gnqvl" podStartSLOduration=2.7225953929999998 podStartE2EDuration="3.428285933s" podCreationTimestamp="2026-04-24 22:32:34 +0000 UTC" firstStartedPulling="2026-04-24 22:32:35.285987235 +0000 UTC m=+163.982610068" lastFinishedPulling="2026-04-24 22:32:35.991677772 +0000 UTC m=+164.688300608" observedRunningTime="2026-04-24 22:32:37.426868914 +0000 UTC m=+166.123491765" watchObservedRunningTime="2026-04-24 22:32:37.428285933 +0000 UTC m=+166.124908780" Apr 24 22:32:39.408471 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:39.408439 2571 generic.go:358] "Generic (PLEG): container finished" podID="5fbe09f4-5ecf-4342-880c-3bde8f156081" containerID="6f59ca9267e6d2ae7a6f548be9dd6846cced15480b0060a8a17ab46c93672cc8" exitCode=1 Apr 24 22:32:39.408831 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:39.408516 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" 
event={"ID":"5fbe09f4-5ecf-4342-880c-3bde8f156081","Type":"ContainerDied","Data":"6f59ca9267e6d2ae7a6f548be9dd6846cced15480b0060a8a17ab46c93672cc8"} Apr 24 22:32:39.408909 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:39.408892 2571 scope.go:117] "RemoveContainer" containerID="6f59ca9267e6d2ae7a6f548be9dd6846cced15480b0060a8a17ab46c93672cc8" Apr 24 22:32:39.409860 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:39.409840 2571 generic.go:358] "Generic (PLEG): container finished" podID="f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc" containerID="815fb05b38b5a0619a71064f0acd16bcf2b8bc6868acf5507e09bdee5a9b3a98" exitCode=255 Apr 24 22:32:39.409946 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:39.409878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" event={"ID":"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc","Type":"ContainerDied","Data":"815fb05b38b5a0619a71064f0acd16bcf2b8bc6868acf5507e09bdee5a9b3a98"} Apr 24 22:32:39.410116 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:39.410103 2571 scope.go:117] "RemoveContainer" containerID="815fb05b38b5a0619a71064f0acd16bcf2b8bc6868acf5507e09bdee5a9b3a98" Apr 24 22:32:39.849044 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:39.848962 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:32:40.097230 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:40.097201 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:32:40.416836 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:40.416805 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c94ccb6c5-k8m8x" event={"ID":"f791f0e2-deb9-4c1d-a460-b5cb6fb9d5fc","Type":"ContainerStarted","Data":"c702117c68ef92f6b702e0238db864c39f5e2ab3d18cad97f14c055d57a3797f"} Apr 24 22:32:40.418389 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:40.418366 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" event={"ID":"5fbe09f4-5ecf-4342-880c-3bde8f156081","Type":"ContainerStarted","Data":"9184d4f4879b37f67b3a77b0468f13295c1bd1ff2fc22060bccc68eed41fed3d"} Apr 24 22:32:40.418560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:40.418544 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:32:40.419137 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:40.419111 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5877c69d5c-drh4k" Apr 24 22:32:41.849756 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:41.849723 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:32:41.852362 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:41.852335 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xzwft\"" Apr 24 22:32:41.860453 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:41.860436 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:32:41.972127 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:41.972101 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c8cfb6695-qd2ch"] Apr 24 22:32:41.976146 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:32:41.976121 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6267a99a_1aea_462f_bab4_ce95abd8548d.slice/crio-3f445774e7ed611992a70c92de0553dce0b59544017110d139c3d97e9229c5b1 WatchSource:0}: Error finding container 3f445774e7ed611992a70c92de0553dce0b59544017110d139c3d97e9229c5b1: Status 404 returned error can't find the container with id 3f445774e7ed611992a70c92de0553dce0b59544017110d139c3d97e9229c5b1 Apr 24 22:32:42.425220 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:42.425187 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" event={"ID":"6267a99a-1aea-462f-bab4-ce95abd8548d","Type":"ContainerStarted","Data":"6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33"} Apr 24 22:32:42.425220 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:42.425224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" event={"ID":"6267a99a-1aea-462f-bab4-ce95abd8548d","Type":"ContainerStarted","Data":"3f445774e7ed611992a70c92de0553dce0b59544017110d139c3d97e9229c5b1"} Apr 24 22:32:42.444499 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:42.444456 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" podStartSLOduration=170.444442165 podStartE2EDuration="2m50.444442165s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:42.44336686 +0000 UTC m=+171.139989709" watchObservedRunningTime="2026-04-24 22:32:42.444442165 +0000 UTC m=+171.141065016" Apr 24 22:32:43.430980 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:43.430947 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:32:45.398876 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:45.398848 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dktqj" Apr 24 22:32:54.995739 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:32:54.995705 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c8cfb6695-qd2ch"] Apr 24 22:33:01.104563 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:01.104535 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dktqj_859f8212-1b13-42e6-b832-83bafe50547d/dns/0.log" Apr 24 22:33:01.304411 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:01.304381 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dktqj_859f8212-1b13-42e6-b832-83bafe50547d/kube-rbac-proxy/0.log" Apr 24 22:33:02.504491 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:02.504461 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mjsxr_664d5264-1f8a-4986-9272-2e8a718a8923/dns-node-resolver/0.log" Apr 24 22:33:03.504008 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:03.503978 2571 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vb7wv_dc34a3cc-7a87-42d1-a0c7-d317f40146bb/serve-healthcheck-canary/0.log" Apr 24 22:33:05.001501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:05.001470 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:33:20.013893 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.013827 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" podUID="6267a99a-1aea-462f-bab4-ce95abd8548d" containerName="registry" containerID="cri-o://6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33" gracePeriod=30 Apr 24 22:33:20.242417 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.242395 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:33:20.328545 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.328484 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6267a99a-1aea-462f-bab4-ce95abd8548d-ca-trust-extracted\") pod \"6267a99a-1aea-462f-bab4-ce95abd8548d\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " Apr 24 22:33:20.328545 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.328523 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") pod \"6267a99a-1aea-462f-bab4-ce95abd8548d\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " Apr 24 22:33:20.328699 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.328550 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-bound-sa-token\") pod \"6267a99a-1aea-462f-bab4-ce95abd8548d\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " Apr 24 22:33:20.328699 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.328585 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-image-registry-private-configuration\") pod \"6267a99a-1aea-462f-bab4-ce95abd8548d\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " Apr 24 22:33:20.328699 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.328604 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-installation-pull-secrets\") pod \"6267a99a-1aea-462f-bab4-ce95abd8548d\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " Apr 24 22:33:20.328699 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.328631 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-certificates\") pod \"6267a99a-1aea-462f-bab4-ce95abd8548d\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " Apr 24 22:33:20.328699 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.328655 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-trusted-ca\") pod \"6267a99a-1aea-462f-bab4-ce95abd8548d\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " Apr 24 22:33:20.328699 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.328688 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xkq9\" (UniqueName: 
\"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-kube-api-access-8xkq9\") pod \"6267a99a-1aea-462f-bab4-ce95abd8548d\" (UID: \"6267a99a-1aea-462f-bab4-ce95abd8548d\") " Apr 24 22:33:20.329215 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.329121 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6267a99a-1aea-462f-bab4-ce95abd8548d" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:20.329323 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.329241 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6267a99a-1aea-462f-bab4-ce95abd8548d" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:20.331336 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.331304 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6267a99a-1aea-462f-bab4-ce95abd8548d" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:20.331437 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.331342 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "6267a99a-1aea-462f-bab4-ce95abd8548d" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d"). 
InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:20.331437 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.331412 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6267a99a-1aea-462f-bab4-ce95abd8548d" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:20.331523 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.331480 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6267a99a-1aea-462f-bab4-ce95abd8548d" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:20.331606 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.331581 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-kube-api-access-8xkq9" (OuterVolumeSpecName: "kube-api-access-8xkq9") pod "6267a99a-1aea-462f-bab4-ce95abd8548d" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d"). InnerVolumeSpecName "kube-api-access-8xkq9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:20.337217 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.337189 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6267a99a-1aea-462f-bab4-ce95abd8548d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6267a99a-1aea-462f-bab4-ce95abd8548d" (UID: "6267a99a-1aea-462f-bab4-ce95abd8548d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:33:20.429848 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.429827 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6267a99a-1aea-462f-bab4-ce95abd8548d-ca-trust-extracted\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:33:20.429848 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.429848 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-tls\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:33:20.429963 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.429858 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-bound-sa-token\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:33:20.429963 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.429866 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-image-registry-private-configuration\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:33:20.429963 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.429875 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6267a99a-1aea-462f-bab4-ce95abd8548d-installation-pull-secrets\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:33:20.429963 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.429885 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-registry-certificates\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 
22:33:20.429963 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.429893 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6267a99a-1aea-462f-bab4-ce95abd8548d-trusted-ca\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:33:20.429963 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.429902 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8xkq9\" (UniqueName: \"kubernetes.io/projected/6267a99a-1aea-462f-bab4-ce95abd8548d-kube-api-access-8xkq9\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:33:20.524426 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.524395 2571 generic.go:358] "Generic (PLEG): container finished" podID="6267a99a-1aea-462f-bab4-ce95abd8548d" containerID="6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33" exitCode=0 Apr 24 22:33:20.524535 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.524447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" event={"ID":"6267a99a-1aea-462f-bab4-ce95abd8548d","Type":"ContainerDied","Data":"6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33"} Apr 24 22:33:20.524535 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.524460 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" Apr 24 22:33:20.524535 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.524472 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c8cfb6695-qd2ch" event={"ID":"6267a99a-1aea-462f-bab4-ce95abd8548d","Type":"ContainerDied","Data":"3f445774e7ed611992a70c92de0553dce0b59544017110d139c3d97e9229c5b1"} Apr 24 22:33:20.524535 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.524487 2571 scope.go:117] "RemoveContainer" containerID="6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33" Apr 24 22:33:20.532687 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.532668 2571 scope.go:117] "RemoveContainer" containerID="6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33" Apr 24 22:33:20.532937 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:33:20.532917 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33\": container with ID starting with 6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33 not found: ID does not exist" containerID="6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33" Apr 24 22:33:20.532984 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.532944 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33"} err="failed to get container status \"6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33\": rpc error: code = NotFound desc = could not find container \"6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33\": container with ID starting with 6accfbac5a54fdd67511e591e984d09475ac08f5acdcf939cc86c0377cf01d33 not found: ID does not exist" Apr 24 22:33:20.542805 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.542785 
2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c8cfb6695-qd2ch"] Apr 24 22:33:20.546596 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:20.546576 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-c8cfb6695-qd2ch"] Apr 24 22:33:21.648995 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:21.648938 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" podUID="f4fef7d4-2ad5-4750-b900-bb8ea8141fc5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:33:21.852287 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:21.852253 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6267a99a-1aea-462f-bab4-ce95abd8548d" path="/var/lib/kubelet/pods/6267a99a-1aea-462f-bab4-ce95abd8548d/volumes" Apr 24 22:33:31.648983 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:31.648940 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" podUID="f4fef7d4-2ad5-4750-b900-bb8ea8141fc5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:33:41.649397 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:41.649355 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" podUID="f4fef7d4-2ad5-4750-b900-bb8ea8141fc5" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:33:41.649946 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:41.649437 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" Apr 24 22:33:41.650043 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:33:41.650020 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"0fe1839422c23cf132c9fd1a0160199fdf51b5262e5dc079c9a79179777f2dd2"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 22:33:41.650102 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:41.650073 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" podUID="f4fef7d4-2ad5-4750-b900-bb8ea8141fc5" containerName="service-proxy" containerID="cri-o://0fe1839422c23cf132c9fd1a0160199fdf51b5262e5dc079c9a79179777f2dd2" gracePeriod=30 Apr 24 22:33:42.582940 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:42.582899 2571 generic.go:358] "Generic (PLEG): container finished" podID="f4fef7d4-2ad5-4750-b900-bb8ea8141fc5" containerID="0fe1839422c23cf132c9fd1a0160199fdf51b5262e5dc079c9a79179777f2dd2" exitCode=2 Apr 24 22:33:42.583120 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:42.582963 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" event={"ID":"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5","Type":"ContainerDied","Data":"0fe1839422c23cf132c9fd1a0160199fdf51b5262e5dc079c9a79179777f2dd2"} Apr 24 22:33:42.583120 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:33:42.583006 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84c59c454f-6zwpr" event={"ID":"f4fef7d4-2ad5-4750-b900-bb8ea8141fc5","Type":"ContainerStarted","Data":"3188aa1dce51f57c2b3d37e32bdda7ba50f334f0045d122e46b28c6135a9b8b3"} Apr 24 22:34:02.706247 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:02.706214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:34:02.708377 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:02.708345 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12c9d9cf-479c-46fd-9333-94213f4ff2f0-metrics-certs\") pod \"network-metrics-daemon-tphln\" (UID: \"12c9d9cf-479c-46fd-9333-94213f4ff2f0\") " pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:34:02.951578 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:02.951548 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dgz7\"" Apr 24 22:34:02.960379 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:02.960319 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tphln" Apr 24 22:34:03.069786 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:03.069758 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tphln"] Apr 24 22:34:03.073364 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:34:03.073336 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c9d9cf_479c_46fd_9333_94213f4ff2f0.slice/crio-cdc825b90a926a7b890c6218d7b1c40dd4f21717ac12ac982b7223451e5b4fd1 WatchSource:0}: Error finding container cdc825b90a926a7b890c6218d7b1c40dd4f21717ac12ac982b7223451e5b4fd1: Status 404 returned error can't find the container with id cdc825b90a926a7b890c6218d7b1c40dd4f21717ac12ac982b7223451e5b4fd1 Apr 24 22:34:03.636851 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:03.636822 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tphln" 
event={"ID":"12c9d9cf-479c-46fd-9333-94213f4ff2f0","Type":"ContainerStarted","Data":"cdc825b90a926a7b890c6218d7b1c40dd4f21717ac12ac982b7223451e5b4fd1"} Apr 24 22:34:04.640823 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:04.640785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tphln" event={"ID":"12c9d9cf-479c-46fd-9333-94213f4ff2f0","Type":"ContainerStarted","Data":"1ecda7c09714c9743f0253081ea68207e1667e52d00dde25be4d07f366b27bb2"} Apr 24 22:34:04.640823 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:04.640826 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tphln" event={"ID":"12c9d9cf-479c-46fd-9333-94213f4ff2f0","Type":"ContainerStarted","Data":"167d5266155ea94f68f1c4d4cd0dbdd33ae4e422c0d31e2b4bf17318170a94f3"} Apr 24 22:34:04.658258 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:04.658216 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tphln" podStartSLOduration=251.658919981 podStartE2EDuration="4m12.658203162s" podCreationTimestamp="2026-04-24 22:29:52 +0000 UTC" firstStartedPulling="2026-04-24 22:34:03.075251319 +0000 UTC m=+251.771874147" lastFinishedPulling="2026-04-24 22:34:04.0745345 +0000 UTC m=+252.771157328" observedRunningTime="2026-04-24 22:34:04.657625052 +0000 UTC m=+253.354247902" watchObservedRunningTime="2026-04-24 22:34:04.658203162 +0000 UTC m=+253.354826012" Apr 24 22:34:48.748249 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.748211 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t"] Apr 24 22:34:48.748693 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.748453 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6267a99a-1aea-462f-bab4-ce95abd8548d" containerName="registry" Apr 24 22:34:48.748693 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:34:48.748464 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6267a99a-1aea-462f-bab4-ce95abd8548d" containerName="registry" Apr 24 22:34:48.748693 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.748503 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6267a99a-1aea-462f-bab4-ce95abd8548d" containerName="registry" Apr 24 22:34:48.750944 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.750927 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:48.752948 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.752923 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 22:34:48.753077 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.752977 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 22:34:48.753223 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.753206 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bdjkc\"" Apr 24 22:34:48.758556 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.758501 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t"] Apr 24 22:34:48.797408 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.797384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scj76\" (UniqueName: \"kubernetes.io/projected/ac38e846-05ca-4683-b4a3-f5f8eb748c23-kube-api-access-scj76\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 
22:34:48.797517 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.797420 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:48.797517 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.797455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:48.898513 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.898485 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:48.898621 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.898550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scj76\" (UniqueName: \"kubernetes.io/projected/ac38e846-05ca-4683-b4a3-f5f8eb748c23-kube-api-access-scj76\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:48.898731 ip-10-0-133-9 kubenswrapper[2571]: 
I0424 22:34:48.898712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:48.898960 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.898937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:48.899049 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.899005 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:48.906203 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:48.906176 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scj76\" (UniqueName: \"kubernetes.io/projected/ac38e846-05ca-4683-b4a3-f5f8eb748c23-kube-api-access-scj76\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:49.060653 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:49.060592 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:34:49.170805 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:49.170772 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t"] Apr 24 22:34:49.173951 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:34:49.173919 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac38e846_05ca_4683_b4a3_f5f8eb748c23.slice/crio-248f7ca5bb42245fa434fbe69e4fa2c34bdac53cf4f0bb714412df0a43ae5db8 WatchSource:0}: Error finding container 248f7ca5bb42245fa434fbe69e4fa2c34bdac53cf4f0bb714412df0a43ae5db8: Status 404 returned error can't find the container with id 248f7ca5bb42245fa434fbe69e4fa2c34bdac53cf4f0bb714412df0a43ae5db8 Apr 24 22:34:49.757040 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:49.756987 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" event={"ID":"ac38e846-05ca-4683-b4a3-f5f8eb748c23","Type":"ContainerStarted","Data":"248f7ca5bb42245fa434fbe69e4fa2c34bdac53cf4f0bb714412df0a43ae5db8"} Apr 24 22:34:51.772336 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:51.772304 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log" Apr 24 22:34:51.773014 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:51.772994 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log" Apr 24 22:34:51.777085 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:51.777064 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:34:54.771427 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:54.771356 2571 
generic.go:358] "Generic (PLEG): container finished" podID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerID="67b8e3523419a6a146be9e8a22f679c9bdee0d451e388585b98e671aba8956df" exitCode=0 Apr 24 22:34:54.771427 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:54.771394 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" event={"ID":"ac38e846-05ca-4683-b4a3-f5f8eb748c23","Type":"ContainerDied","Data":"67b8e3523419a6a146be9e8a22f679c9bdee0d451e388585b98e671aba8956df"} Apr 24 22:34:54.772349 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:54.772333 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:34:56.778541 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:56.778512 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" event={"ID":"ac38e846-05ca-4683-b4a3-f5f8eb748c23","Type":"ContainerStarted","Data":"30339d77f5a05f897a0e08344e1752117f25abef9287c7418c16416f122a6f62"} Apr 24 22:34:57.783408 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:57.783375 2571 generic.go:358] "Generic (PLEG): container finished" podID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerID="30339d77f5a05f897a0e08344e1752117f25abef9287c7418c16416f122a6f62" exitCode=0 Apr 24 22:34:57.783776 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:34:57.783435 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" event={"ID":"ac38e846-05ca-4683-b4a3-f5f8eb748c23","Type":"ContainerDied","Data":"30339d77f5a05f897a0e08344e1752117f25abef9287c7418c16416f122a6f62"} Apr 24 22:35:03.807827 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:03.807748 2571 generic.go:358] "Generic (PLEG): container finished" podID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" 
containerID="ec927a53072a9eadd05965e03ce7da6992ade777534d61a433ede246f3e91ca2" exitCode=0 Apr 24 22:35:03.807827 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:03.807812 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" event={"ID":"ac38e846-05ca-4683-b4a3-f5f8eb748c23","Type":"ContainerDied","Data":"ec927a53072a9eadd05965e03ce7da6992ade777534d61a433ede246f3e91ca2"} Apr 24 22:35:04.917634 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:04.917610 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:35:05.013072 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.013048 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-util\") pod \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " Apr 24 22:35:05.013209 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.013111 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scj76\" (UniqueName: \"kubernetes.io/projected/ac38e846-05ca-4683-b4a3-f5f8eb748c23-kube-api-access-scj76\") pod \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " Apr 24 22:35:05.013209 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.013134 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-bundle\") pod \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\" (UID: \"ac38e846-05ca-4683-b4a3-f5f8eb748c23\") " Apr 24 22:35:05.013661 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.013642 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-bundle" (OuterVolumeSpecName: "bundle") pod "ac38e846-05ca-4683-b4a3-f5f8eb748c23" (UID: "ac38e846-05ca-4683-b4a3-f5f8eb748c23"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:35:05.015225 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.015202 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac38e846-05ca-4683-b4a3-f5f8eb748c23-kube-api-access-scj76" (OuterVolumeSpecName: "kube-api-access-scj76") pod "ac38e846-05ca-4683-b4a3-f5f8eb748c23" (UID: "ac38e846-05ca-4683-b4a3-f5f8eb748c23"). InnerVolumeSpecName "kube-api-access-scj76". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:35:05.016849 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.016826 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-util" (OuterVolumeSpecName: "util") pod "ac38e846-05ca-4683-b4a3-f5f8eb748c23" (UID: "ac38e846-05ca-4683-b4a3-f5f8eb748c23"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:35:05.114379 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.114326 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-scj76\" (UniqueName: \"kubernetes.io/projected/ac38e846-05ca-4683-b4a3-f5f8eb748c23-kube-api-access-scj76\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:35:05.114379 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.114349 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:35:05.114379 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.114358 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac38e846-05ca-4683-b4a3-f5f8eb748c23-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:35:05.814647 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.814610 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" event={"ID":"ac38e846-05ca-4683-b4a3-f5f8eb748c23","Type":"ContainerDied","Data":"248f7ca5bb42245fa434fbe69e4fa2c34bdac53cf4f0bb714412df0a43ae5db8"} Apr 24 22:35:05.814647 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.814646 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="248f7ca5bb42245fa434fbe69e4fa2c34bdac53cf4f0bb714412df0a43ae5db8" Apr 24 22:35:05.814849 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:05.814671 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cxcg2t" Apr 24 22:35:10.674026 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.673984 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42"] Apr 24 22:35:10.674511 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.674219 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerName="pull" Apr 24 22:35:10.674511 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.674230 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerName="pull" Apr 24 22:35:10.674511 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.674240 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerName="extract" Apr 24 22:35:10.674511 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.674245 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerName="extract" Apr 24 22:35:10.674511 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.674259 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerName="util" Apr 24 22:35:10.674511 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.674264 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerName="util" Apr 24 22:35:10.674511 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.674306 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac38e846-05ca-4683-b4a3-f5f8eb748c23" containerName="extract" Apr 24 22:35:10.683293 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.683273 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:10.685433 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.685398 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 22:35:10.685570 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.685532 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-88pp5\"" Apr 24 22:35:10.685570 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.685546 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 22:35:10.685933 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.685893 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 22:35:10.686903 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.686881 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42"] Apr 24 22:35:10.752603 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.752581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4vgl\" (UniqueName: \"kubernetes.io/projected/0e47e63d-5cba-4ea0-961d-f386ebc17af0-kube-api-access-m4vgl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-txq42\" (UID: \"0e47e63d-5cba-4ea0-961d-f386ebc17af0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:10.752701 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.752611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0e47e63d-5cba-4ea0-961d-f386ebc17af0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-txq42\" (UID: 
\"0e47e63d-5cba-4ea0-961d-f386ebc17af0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:10.853387 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.853363 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4vgl\" (UniqueName: \"kubernetes.io/projected/0e47e63d-5cba-4ea0-961d-f386ebc17af0-kube-api-access-m4vgl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-txq42\" (UID: \"0e47e63d-5cba-4ea0-961d-f386ebc17af0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:10.853509 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.853401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0e47e63d-5cba-4ea0-961d-f386ebc17af0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-txq42\" (UID: \"0e47e63d-5cba-4ea0-961d-f386ebc17af0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:10.855793 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.855770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0e47e63d-5cba-4ea0-961d-f386ebc17af0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-txq42\" (UID: \"0e47e63d-5cba-4ea0-961d-f386ebc17af0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:10.861327 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:10.861303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4vgl\" (UniqueName: \"kubernetes.io/projected/0e47e63d-5cba-4ea0-961d-f386ebc17af0-kube-api-access-m4vgl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-txq42\" (UID: \"0e47e63d-5cba-4ea0-961d-f386ebc17af0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:10.994232 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:35:10.994165 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:11.112239 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:11.112138 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42"] Apr 24 22:35:11.114489 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:35:11.114455 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e47e63d_5cba_4ea0_961d_f386ebc17af0.slice/crio-2eaa57d0e275e307c86b744d0350513a0a36b6c3ce5e3215a41f02f2ee2769ca WatchSource:0}: Error finding container 2eaa57d0e275e307c86b744d0350513a0a36b6c3ce5e3215a41f02f2ee2769ca: Status 404 returned error can't find the container with id 2eaa57d0e275e307c86b744d0350513a0a36b6c3ce5e3215a41f02f2ee2769ca Apr 24 22:35:11.831715 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:11.831673 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" event={"ID":"0e47e63d-5cba-4ea0-961d-f386ebc17af0","Type":"ContainerStarted","Data":"2eaa57d0e275e307c86b744d0350513a0a36b6c3ce5e3215a41f02f2ee2769ca"} Apr 24 22:35:14.841501 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:14.841464 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" event={"ID":"0e47e63d-5cba-4ea0-961d-f386ebc17af0","Type":"ContainerStarted","Data":"8f82c5d6c3c9a084b62c531cb2276461617122b5cd9e15093a26c7d491568abb"} Apr 24 22:35:14.841944 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:14.841590 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:14.860467 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:14.860415 2571 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" podStartSLOduration=1.304045204 podStartE2EDuration="4.860401002s" podCreationTimestamp="2026-04-24 22:35:10 +0000 UTC" firstStartedPulling="2026-04-24 22:35:11.116176767 +0000 UTC m=+319.812799598" lastFinishedPulling="2026-04-24 22:35:14.672532552 +0000 UTC m=+323.369155396" observedRunningTime="2026-04-24 22:35:14.858934567 +0000 UTC m=+323.555557416" watchObservedRunningTime="2026-04-24 22:35:14.860401002 +0000 UTC m=+323.557023852" Apr 24 22:35:15.217764 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.217729 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-j8l7j"] Apr 24 22:35:15.237438 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.237406 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-j8l7j"] Apr 24 22:35:15.237558 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.237476 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.239347 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.239314 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 22:35:15.239458 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.239360 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-t2khr\"" Apr 24 22:35:15.239458 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.239368 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 22:35:15.385368 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.385342 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-certificates\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.385520 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.385375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-cabundle0\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.385520 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.385397 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5bqg\" (UniqueName: \"kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-kube-api-access-s5bqg\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 
22:35:15.486212 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.486126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-certificates\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.486212 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.486185 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-cabundle0\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.486425 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.486219 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5bqg\" (UniqueName: \"kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-kube-api-access-s5bqg\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.486425 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:35:15.486268 2571 secret.go:281] references non-existent secret key: ca.crt Apr 24 22:35:15.486425 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:35:15.486286 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 22:35:15.486425 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:35:15.486294 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-j8l7j: references non-existent secret key: ca.crt Apr 24 22:35:15.486425 ip-10-0-133-9 kubenswrapper[2571]: E0424 22:35:15.486347 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-certificates podName:9d9474cb-3792-42b9-b2d0-0df15f96ca5d nodeName:}" failed. No retries permitted until 2026-04-24 22:35:15.98632862 +0000 UTC m=+324.682951449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-certificates") pod "keda-operator-ffbb595cb-j8l7j" (UID: "9d9474cb-3792-42b9-b2d0-0df15f96ca5d") : references non-existent secret key: ca.crt Apr 24 22:35:15.486843 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.486820 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-cabundle0\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.499326 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.499301 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5bqg\" (UniqueName: \"kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-kube-api-access-s5bqg\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.808223 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.808141 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6g5wp"] Apr 24 22:35:15.837210 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.837178 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6g5wp"] Apr 24 22:35:15.837321 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.837266 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:15.839256 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.839232 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 22:35:15.990135 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.990108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-certificates\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:15.990559 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.990178 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/81c88beb-a805-4f15-9866-b3fa4220135a-certificates\") pod \"keda-admission-cf49989db-6g5wp\" (UID: \"81c88beb-a805-4f15-9866-b3fa4220135a\") " pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:15.990559 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.990240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb5j7\" (UniqueName: \"kubernetes.io/projected/81c88beb-a805-4f15-9866-b3fa4220135a-kube-api-access-rb5j7\") pod \"keda-admission-cf49989db-6g5wp\" (UID: \"81c88beb-a805-4f15-9866-b3fa4220135a\") " pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:15.992560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:15.992537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/9d9474cb-3792-42b9-b2d0-0df15f96ca5d-certificates\") pod \"keda-operator-ffbb595cb-j8l7j\" (UID: \"9d9474cb-3792-42b9-b2d0-0df15f96ca5d\") " pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:16.090796 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.090735 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rb5j7\" (UniqueName: \"kubernetes.io/projected/81c88beb-a805-4f15-9866-b3fa4220135a-kube-api-access-rb5j7\") pod \"keda-admission-cf49989db-6g5wp\" (UID: \"81c88beb-a805-4f15-9866-b3fa4220135a\") " pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:16.090796 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.090782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/81c88beb-a805-4f15-9866-b3fa4220135a-certificates\") pod \"keda-admission-cf49989db-6g5wp\" (UID: \"81c88beb-a805-4f15-9866-b3fa4220135a\") " pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:16.092982 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.092958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/81c88beb-a805-4f15-9866-b3fa4220135a-certificates\") pod \"keda-admission-cf49989db-6g5wp\" (UID: \"81c88beb-a805-4f15-9866-b3fa4220135a\") " pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:16.099636 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.099607 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb5j7\" (UniqueName: \"kubernetes.io/projected/81c88beb-a805-4f15-9866-b3fa4220135a-kube-api-access-rb5j7\") pod \"keda-admission-cf49989db-6g5wp\" (UID: \"81c88beb-a805-4f15-9866-b3fa4220135a\") " pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:16.147620 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.147597 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:35:16.148096 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.148081 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:16.275290 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.275263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-j8l7j"] Apr 24 22:35:16.278417 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:35:16.278389 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d9474cb_3792_42b9_b2d0_0df15f96ca5d.slice/crio-3e6c96fa47d3231cb1699049c70cd39e63f3d2ca3ce67a3a43cafc2e9d417179 WatchSource:0}: Error finding container 3e6c96fa47d3231cb1699049c70cd39e63f3d2ca3ce67a3a43cafc2e9d417179: Status 404 returned error can't find the container with id 3e6c96fa47d3231cb1699049c70cd39e63f3d2ca3ce67a3a43cafc2e9d417179 Apr 24 22:35:16.294305 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.294287 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6g5wp"] Apr 24 22:35:16.296290 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:35:16.296267 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c88beb_a805_4f15_9866_b3fa4220135a.slice/crio-aca9de4e11fc14fe41d15ca74834cbeeb477f64c4cb985ad071f8c8f40c9bd28 WatchSource:0}: Error finding container aca9de4e11fc14fe41d15ca74834cbeeb477f64c4cb985ad071f8c8f40c9bd28: Status 404 returned error can't find the container with id aca9de4e11fc14fe41d15ca74834cbeeb477f64c4cb985ad071f8c8f40c9bd28 Apr 24 22:35:16.851456 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.851423 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6g5wp" event={"ID":"81c88beb-a805-4f15-9866-b3fa4220135a","Type":"ContainerStarted","Data":"aca9de4e11fc14fe41d15ca74834cbeeb477f64c4cb985ad071f8c8f40c9bd28"} Apr 24 22:35:16.852468 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:16.852442 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" event={"ID":"9d9474cb-3792-42b9-b2d0-0df15f96ca5d","Type":"ContainerStarted","Data":"3e6c96fa47d3231cb1699049c70cd39e63f3d2ca3ce67a3a43cafc2e9d417179"} Apr 24 22:35:18.859019 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:18.858978 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6g5wp" event={"ID":"81c88beb-a805-4f15-9866-b3fa4220135a","Type":"ContainerStarted","Data":"a53fe9df8f43fa4caff6351637c06c5ab2e0f334421e09d91f2748cfdf4c8a5d"} Apr 24 22:35:18.859524 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:18.859128 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:18.877190 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:18.877126 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6g5wp" podStartSLOduration=2.279725853 podStartE2EDuration="3.877109298s" podCreationTimestamp="2026-04-24 22:35:15 +0000 UTC" firstStartedPulling="2026-04-24 22:35:16.297531328 +0000 UTC m=+324.994154157" lastFinishedPulling="2026-04-24 22:35:17.894914761 +0000 UTC m=+326.591537602" observedRunningTime="2026-04-24 22:35:18.875407018 +0000 UTC m=+327.572029868" watchObservedRunningTime="2026-04-24 22:35:18.877109298 +0000 UTC m=+327.573732147" Apr 24 22:35:19.862660 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:19.862580 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" event={"ID":"9d9474cb-3792-42b9-b2d0-0df15f96ca5d","Type":"ContainerStarted","Data":"e32511816619966b66a95b174df685fde0ab17d29153f4ebb69e1e7e4beefbc3"} Apr 24 22:35:19.863088 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:19.862674 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 
24 22:35:19.885819 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:19.885776 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" podStartSLOduration=1.692959849 podStartE2EDuration="4.885763824s" podCreationTimestamp="2026-04-24 22:35:15 +0000 UTC" firstStartedPulling="2026-04-24 22:35:16.279673425 +0000 UTC m=+324.976296254" lastFinishedPulling="2026-04-24 22:35:19.472477396 +0000 UTC m=+328.169100229" observedRunningTime="2026-04-24 22:35:19.884291605 +0000 UTC m=+328.580914454" watchObservedRunningTime="2026-04-24 22:35:19.885763824 +0000 UTC m=+328.582386673" Apr 24 22:35:35.846050 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:35.846021 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-txq42" Apr 24 22:35:39.864382 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:39.864351 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6g5wp" Apr 24 22:35:40.867605 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:35:40.867528 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-j8l7j" Apr 24 22:36:08.708189 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.708139 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq"] Apr 24 22:36:08.717149 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.717124 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:08.718137 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.718110 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq"] Apr 24 22:36:08.719365 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.719339 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bdjkc\"" Apr 24 22:36:08.719464 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.719371 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 22:36:08.719464 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.719372 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 22:36:08.741534 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.741513 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:08.741644 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.741540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpbll\" (UniqueName: \"kubernetes.io/projected/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-kube-api-access-qpbll\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 
22:36:08.741644 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.741561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:08.841879 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.841844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:08.841987 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.841894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpbll\" (UniqueName: \"kubernetes.io/projected/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-kube-api-access-qpbll\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:08.841987 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.841915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:08.842697 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:36:08.842677 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:08.842761 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.842718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:08.849951 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:08.849930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpbll\" (UniqueName: \"kubernetes.io/projected/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-kube-api-access-qpbll\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:09.026646 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:09.026584 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:09.344171 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:09.344134 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq"] Apr 24 22:36:09.346577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:36:09.346552 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod324cc9d1_76b3_4c9c_a5fb_07e262e44d24.slice/crio-92d685c427f704c9926c16989a17eeb9eb3b0b366eca90879d273c84509147ca WatchSource:0}: Error finding container 92d685c427f704c9926c16989a17eeb9eb3b0b366eca90879d273c84509147ca: Status 404 returned error can't find the container with id 92d685c427f704c9926c16989a17eeb9eb3b0b366eca90879d273c84509147ca Apr 24 22:36:09.991630 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:09.991595 2571 generic.go:358] "Generic (PLEG): container finished" podID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerID="0fef26d04f60806cb7f0153d06c87091ac95d0a7403d37c8f46a108d6a7ffd21" exitCode=0 Apr 24 22:36:09.992016 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:09.991650 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" event={"ID":"324cc9d1-76b3-4c9c-a5fb-07e262e44d24","Type":"ContainerDied","Data":"0fef26d04f60806cb7f0153d06c87091ac95d0a7403d37c8f46a108d6a7ffd21"} Apr 24 22:36:09.992016 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:09.991676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" event={"ID":"324cc9d1-76b3-4c9c-a5fb-07e262e44d24","Type":"ContainerStarted","Data":"92d685c427f704c9926c16989a17eeb9eb3b0b366eca90879d273c84509147ca"} Apr 24 22:36:13.002295 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:36:13.002260 2571 generic.go:358] "Generic (PLEG): container finished" podID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerID="0477cc376bc6d677b35f6f3f70c61a6f2b07e2874b1941703935fb6a2f178c8b" exitCode=0 Apr 24 22:36:13.002669 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:13.002340 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" event={"ID":"324cc9d1-76b3-4c9c-a5fb-07e262e44d24","Type":"ContainerDied","Data":"0477cc376bc6d677b35f6f3f70c61a6f2b07e2874b1941703935fb6a2f178c8b"} Apr 24 22:36:14.006716 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:14.006680 2571 generic.go:358] "Generic (PLEG): container finished" podID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerID="ea77a7337bcc4c686aa540272628d38b7abae2f1c9c3057723901d272aaec42f" exitCode=0 Apr 24 22:36:14.007190 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:14.006733 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" event={"ID":"324cc9d1-76b3-4c9c-a5fb-07e262e44d24","Type":"ContainerDied","Data":"ea77a7337bcc4c686aa540272628d38b7abae2f1c9c3057723901d272aaec42f"} Apr 24 22:36:15.120502 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.120480 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:15.186170 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.186134 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-bundle\") pod \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " Apr 24 22:36:15.186277 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.186200 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpbll\" (UniqueName: \"kubernetes.io/projected/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-kube-api-access-qpbll\") pod \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " Apr 24 22:36:15.186277 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.186237 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-util\") pod \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\" (UID: \"324cc9d1-76b3-4c9c-a5fb-07e262e44d24\") " Apr 24 22:36:15.186800 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.186771 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-bundle" (OuterVolumeSpecName: "bundle") pod "324cc9d1-76b3-4c9c-a5fb-07e262e44d24" (UID: "324cc9d1-76b3-4c9c-a5fb-07e262e44d24"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:36:15.188112 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.188086 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-kube-api-access-qpbll" (OuterVolumeSpecName: "kube-api-access-qpbll") pod "324cc9d1-76b3-4c9c-a5fb-07e262e44d24" (UID: "324cc9d1-76b3-4c9c-a5fb-07e262e44d24"). InnerVolumeSpecName "kube-api-access-qpbll". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:36:15.190854 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.190831 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-util" (OuterVolumeSpecName: "util") pod "324cc9d1-76b3-4c9c-a5fb-07e262e44d24" (UID: "324cc9d1-76b3-4c9c-a5fb-07e262e44d24"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:36:15.286842 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.286787 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:36:15.286842 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.286810 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:36:15.286842 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:15.286819 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qpbll\" (UniqueName: \"kubernetes.io/projected/324cc9d1-76b3-4c9c-a5fb-07e262e44d24-kube-api-access-qpbll\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:36:16.013417 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:16.013381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" event={"ID":"324cc9d1-76b3-4c9c-a5fb-07e262e44d24","Type":"ContainerDied","Data":"92d685c427f704c9926c16989a17eeb9eb3b0b366eca90879d273c84509147ca"} Apr 24 22:36:16.013417 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:16.013414 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dv5xgq" Apr 24 22:36:16.013587 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:16.013418 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92d685c427f704c9926c16989a17eeb9eb3b0b366eca90879d273c84509147ca" Apr 24 22:36:30.823283 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.823253 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"] Apr 24 22:36:30.823649 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.823517 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerName="extract" Apr 24 22:36:30.823649 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.823528 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerName="extract" Apr 24 22:36:30.823649 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.823546 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerName="util" Apr 24 22:36:30.823649 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.823551 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerName="util" Apr 24 22:36:30.823649 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.823557 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerName="pull" Apr 24 22:36:30.823649 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.823563 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerName="pull" Apr 24 22:36:30.823649 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.823605 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="324cc9d1-76b3-4c9c-a5fb-07e262e44d24" containerName="extract" Apr 24 22:36:30.826444 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.826429 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" Apr 24 22:36:30.828111 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.828087 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 22:36:30.828258 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.828132 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 22:36:30.828619 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.828605 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bdjkc\"" Apr 24 22:36:30.834582 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.834559 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"] Apr 24 22:36:30.883917 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.883898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" Apr 24 22:36:30.884026 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.883940 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsrfv\" (UniqueName: \"kubernetes.io/projected/a4936483-66a5-49aa-b6cf-bd04d1623866-kube-api-access-fsrfv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" Apr 24 22:36:30.884026 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.883965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" Apr 24 22:36:30.984794 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.984771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" Apr 24 22:36:30.984873 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.984820 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsrfv\" (UniqueName: \"kubernetes.io/projected/a4936483-66a5-49aa-b6cf-bd04d1623866-kube-api-access-fsrfv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"
Apr 24 22:36:30.984873 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.984854 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"
Apr 24 22:36:30.985207 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.985187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"
Apr 24 22:36:30.985251 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.985202 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"
Apr 24 22:36:30.992204 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:30.992179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsrfv\" (UniqueName: \"kubernetes.io/projected/a4936483-66a5-49aa-b6cf-bd04d1623866-kube-api-access-fsrfv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"
Apr 24 22:36:31.136346 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:31.136328 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"
Apr 24 22:36:31.257994 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:31.257963 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"]
Apr 24 22:36:31.261103 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:36:31.261078 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4936483_66a5_49aa_b6cf_bd04d1623866.slice/crio-de9bf16a2521c4cc6d7e534adc6d6ca6d09ae56749b9b6e39a6c660b40080a67 WatchSource:0}: Error finding container de9bf16a2521c4cc6d7e534adc6d6ca6d09ae56749b9b6e39a6c660b40080a67: Status 404 returned error can't find the container with id de9bf16a2521c4cc6d7e534adc6d6ca6d09ae56749b9b6e39a6c660b40080a67
Apr 24 22:36:32.054622 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:32.054590 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerID="2ae9fd84f4ed8bb180fc6b2a7cc14f5d3d83553d642a720ba7c367ac76c5b7da" exitCode=0
Apr 24 22:36:32.054929 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:32.054628 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" event={"ID":"a4936483-66a5-49aa-b6cf-bd04d1623866","Type":"ContainerDied","Data":"2ae9fd84f4ed8bb180fc6b2a7cc14f5d3d83553d642a720ba7c367ac76c5b7da"}
Apr 24 22:36:32.054929 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:32.054648 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" event={"ID":"a4936483-66a5-49aa-b6cf-bd04d1623866","Type":"ContainerStarted","Data":"de9bf16a2521c4cc6d7e534adc6d6ca6d09ae56749b9b6e39a6c660b40080a67"}
Apr 24 22:36:35.063489 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:35.063460 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerID="0d7b43ef69edef1b883bdd05b6dd40d5798222f615560b7bade6b6dcaa999c6d" exitCode=0
Apr 24 22:36:35.063888 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:35.063511 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" event={"ID":"a4936483-66a5-49aa-b6cf-bd04d1623866","Type":"ContainerDied","Data":"0d7b43ef69edef1b883bdd05b6dd40d5798222f615560b7bade6b6dcaa999c6d"}
Apr 24 22:36:36.067448 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:36.067416 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerID="b6cbf61d91570582658d243e3782b09b4a35acdfb8bb5af20c229bf6e1e92a47" exitCode=0
Apr 24 22:36:36.067811 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:36.067512 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" event={"ID":"a4936483-66a5-49aa-b6cf-bd04d1623866","Type":"ContainerDied","Data":"b6cbf61d91570582658d243e3782b09b4a35acdfb8bb5af20c229bf6e1e92a47"}
Apr 24 22:36:37.185827 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.185805 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"
Apr 24 22:36:37.229185 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.229140 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-util\") pod \"a4936483-66a5-49aa-b6cf-bd04d1623866\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") "
Apr 24 22:36:37.229318 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.229224 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsrfv\" (UniqueName: \"kubernetes.io/projected/a4936483-66a5-49aa-b6cf-bd04d1623866-kube-api-access-fsrfv\") pod \"a4936483-66a5-49aa-b6cf-bd04d1623866\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") "
Apr 24 22:36:37.229318 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.229267 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-bundle\") pod \"a4936483-66a5-49aa-b6cf-bd04d1623866\" (UID: \"a4936483-66a5-49aa-b6cf-bd04d1623866\") "
Apr 24 22:36:37.229687 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.229660 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-bundle" (OuterVolumeSpecName: "bundle") pod "a4936483-66a5-49aa-b6cf-bd04d1623866" (UID: "a4936483-66a5-49aa-b6cf-bd04d1623866"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:36:37.231168 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.231131 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4936483-66a5-49aa-b6cf-bd04d1623866-kube-api-access-fsrfv" (OuterVolumeSpecName: "kube-api-access-fsrfv") pod "a4936483-66a5-49aa-b6cf-bd04d1623866" (UID: "a4936483-66a5-49aa-b6cf-bd04d1623866"). InnerVolumeSpecName "kube-api-access-fsrfv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:36:37.236261 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.236240 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-util" (OuterVolumeSpecName: "util") pod "a4936483-66a5-49aa-b6cf-bd04d1623866" (UID: "a4936483-66a5-49aa-b6cf-bd04d1623866"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:36:37.330004 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.329959 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:36:37.330004 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.329978 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fsrfv\" (UniqueName: \"kubernetes.io/projected/a4936483-66a5-49aa-b6cf-bd04d1623866-kube-api-access-fsrfv\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:36:37.330004 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:37.329987 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4936483-66a5-49aa-b6cf-bd04d1623866-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:36:38.075044 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:38.075014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw" event={"ID":"a4936483-66a5-49aa-b6cf-bd04d1623866","Type":"ContainerDied","Data":"de9bf16a2521c4cc6d7e534adc6d6ca6d09ae56749b9b6e39a6c660b40080a67"}
Apr 24 22:36:38.075044 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:38.075032 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftt2zw"
Apr 24 22:36:38.075044 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:38.075049 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9bf16a2521c4cc6d7e534adc6d6ca6d09ae56749b9b6e39a6c660b40080a67"
Apr 24 22:36:58.884070 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.884037 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"]
Apr 24 22:36:58.884560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.884278 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerName="extract"
Apr 24 22:36:58.884560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.884289 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerName="extract"
Apr 24 22:36:58.884560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.884302 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerName="pull"
Apr 24 22:36:58.884560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.884308 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerName="pull"
Apr 24 22:36:58.884560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.884318 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerName="util"
Apr 24 22:36:58.884560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.884324 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerName="util"
Apr 24 22:36:58.884560 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.884368 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4936483-66a5-49aa-b6cf-bd04d1623866" containerName="extract"
Apr 24 22:36:58.892726 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.892708 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:58.895416 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.895398 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bdjkc\""
Apr 24 22:36:58.895588 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.895570 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 22:36:58.895849 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.895833 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 22:36:58.901505 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.901482 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"]
Apr 24 22:36:58.972624 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.972598 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:58.972733 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.972631 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqmkf\" (UniqueName: \"kubernetes.io/projected/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-kube-api-access-cqmkf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:58.972733 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:58.972691 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:59.073887 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:59.073861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:59.073982 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:59.073890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqmkf\" (UniqueName: \"kubernetes.io/projected/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-kube-api-access-cqmkf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:59.073982 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:59.073911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:59.074304 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:59.074284 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:59.074304 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:59.074297 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:59.082064 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:59.082038 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqmkf\" (UniqueName: \"kubernetes.io/projected/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-kube-api-access-cqmkf\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:59.201701 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:59.201647 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:36:59.315999 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:36:59.315978 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"]
Apr 24 22:36:59.318577 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:36:59.318543 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63b0b78_043e_4c7e_a6bf_15ff5679e5ec.slice/crio-df90af6476c80a1cb139217426bef5195d9158e04e68416f75588a42a07c3019 WatchSource:0}: Error finding container df90af6476c80a1cb139217426bef5195d9158e04e68416f75588a42a07c3019: Status 404 returned error can't find the container with id df90af6476c80a1cb139217426bef5195d9158e04e68416f75588a42a07c3019
Apr 24 22:37:00.139273 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:00.139238 2571 generic.go:358] "Generic (PLEG): container finished" podID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerID="cc95d36f6088b5862e82a58bf190c718b57c26ab961870d9a75a35dfe60defa9" exitCode=0
Apr 24 22:37:00.139608 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:00.139326 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j" event={"ID":"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec","Type":"ContainerDied","Data":"cc95d36f6088b5862e82a58bf190c718b57c26ab961870d9a75a35dfe60defa9"}
Apr 24 22:37:00.139608 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:00.139359 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j" event={"ID":"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec","Type":"ContainerStarted","Data":"df90af6476c80a1cb139217426bef5195d9158e04e68416f75588a42a07c3019"}
Apr 24 22:37:01.143928 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:01.143900 2571 generic.go:358] "Generic (PLEG): container finished" podID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerID="ef8615cf3a67ae80b3a863aefb82eed55757a51368af34e354bf2bb6dcc9666b" exitCode=0
Apr 24 22:37:01.144344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:01.143994 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j" event={"ID":"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec","Type":"ContainerDied","Data":"ef8615cf3a67ae80b3a863aefb82eed55757a51368af34e354bf2bb6dcc9666b"}
Apr 24 22:37:02.148046 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:02.148007 2571 generic.go:358] "Generic (PLEG): container finished" podID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerID="ca2cb35fc5d668a4f0c9d5e2c0b870e49a537a5019a81c8285c001d33389c76e" exitCode=0
Apr 24 22:37:02.148507 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:02.148068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j" event={"ID":"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec","Type":"ContainerDied","Data":"ca2cb35fc5d668a4f0c9d5e2c0b870e49a537a5019a81c8285c001d33389c76e"}
Apr 24 22:37:03.276105 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.276082 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:37:03.404344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.404284 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqmkf\" (UniqueName: \"kubernetes.io/projected/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-kube-api-access-cqmkf\") pod \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") "
Apr 24 22:37:03.404344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.404335 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-bundle\") pod \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") "
Apr 24 22:37:03.404525 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.404365 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-util\") pod \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\" (UID: \"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec\") "
Apr 24 22:37:03.405237 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.405205 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-bundle" (OuterVolumeSpecName: "bundle") pod "a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" (UID: "a63b0b78-043e-4c7e-a6bf-15ff5679e5ec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:03.406331 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.406309 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-kube-api-access-cqmkf" (OuterVolumeSpecName: "kube-api-access-cqmkf") pod "a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" (UID: "a63b0b78-043e-4c7e-a6bf-15ff5679e5ec"). InnerVolumeSpecName "kube-api-access-cqmkf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:37:03.412669 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.412648 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-util" (OuterVolumeSpecName: "util") pod "a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" (UID: "a63b0b78-043e-4c7e-a6bf-15ff5679e5ec"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:03.505094 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.505072 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:03.505094 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.505093 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqmkf\" (UniqueName: \"kubernetes.io/projected/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-kube-api-access-cqmkf\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:03.505244 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:03.505105 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a63b0b78-043e-4c7e-a6bf-15ff5679e5ec-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:04.155226 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:04.155194 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j" event={"ID":"a63b0b78-043e-4c7e-a6bf-15ff5679e5ec","Type":"ContainerDied","Data":"df90af6476c80a1cb139217426bef5195d9158e04e68416f75588a42a07c3019"}
Apr 24 22:37:04.155370 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:04.155230 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df90af6476c80a1cb139217426bef5195d9158e04e68416f75588a42a07c3019"
Apr 24 22:37:04.155370 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:04.155234 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qbb6j"
Apr 24 22:37:13.645510 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.645422 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"]
Apr 24 22:37:13.645915 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.645649 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerName="util"
Apr 24 22:37:13.645915 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.645660 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerName="util"
Apr 24 22:37:13.645915 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.645672 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerName="extract"
Apr 24 22:37:13.645915 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.645677 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerName="extract"
Apr 24 22:37:13.645915 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.645685 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerName="pull"
Apr 24 22:37:13.645915 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.645691 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerName="pull"
Apr 24 22:37:13.645915 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.645737 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a63b0b78-043e-4c7e-a6bf-15ff5679e5ec" containerName="extract"
Apr 24 22:37:13.648579 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.648557 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.651701 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.651677 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bdjkc\""
Apr 24 22:37:13.651791 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.651676 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 22:37:13.652061 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.652041 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 22:37:13.659569 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.659549 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"]
Apr 24 22:37:13.773832 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.773807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwr5h\" (UniqueName: \"kubernetes.io/projected/15f1395e-0368-4e95-aff7-a2545c78886f-kube-api-access-nwr5h\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.773967 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.773849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.773967 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.773886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.874535 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.874509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.874657 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.874543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwr5h\" (UniqueName: \"kubernetes.io/projected/15f1395e-0368-4e95-aff7-a2545c78886f-kube-api-access-nwr5h\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.874657 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.874593 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.874886 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.874868 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.874935 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.874918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.891395 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.891367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwr5h\" (UniqueName: \"kubernetes.io/projected/15f1395e-0368-4e95-aff7-a2545c78886f-kube-api-access-nwr5h\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:13.957282 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:13.957232 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"
Apr 24 22:37:14.077464 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:14.077439 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r"]
Apr 24 22:37:14.079262 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:14.079229 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15f1395e_0368_4e95_aff7_a2545c78886f.slice/crio-934da864c4ed0a536494b65b0001dc8095ca39131ad2d986b16bf45b242cbaff WatchSource:0}: Error finding container 934da864c4ed0a536494b65b0001dc8095ca39131ad2d986b16bf45b242cbaff: Status 404 returned error can't find the container with id 934da864c4ed0a536494b65b0001dc8095ca39131ad2d986b16bf45b242cbaff
Apr 24 22:37:14.186632 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:14.186602 2571 generic.go:358] "Generic (PLEG): container finished" podID="15f1395e-0368-4e95-aff7-a2545c78886f" containerID="dae4ff4e0c67440732affac758693dfba7524f27febaac5d14ffee8458ca3fc7" exitCode=0
Apr 24 22:37:14.186731 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:14.186679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r" event={"ID":"15f1395e-0368-4e95-aff7-a2545c78886f","Type":"ContainerDied","Data":"dae4ff4e0c67440732affac758693dfba7524f27febaac5d14ffee8458ca3fc7"}
Apr 24 22:37:14.186731 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:14.186719 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r" event={"ID":"15f1395e-0368-4e95-aff7-a2545c78886f","Type":"ContainerStarted","Data":"934da864c4ed0a536494b65b0001dc8095ca39131ad2d986b16bf45b242cbaff"}
Apr 24 22:37:15.191571 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.191492 2571 generic.go:358] "Generic (PLEG): container finished" podID="15f1395e-0368-4e95-aff7-a2545c78886f" containerID="c0b83ec2aa6057e770561add910af48e27cf4452c52731044b8867a6b79c4ab6" exitCode=0
Apr 24 22:37:15.191880 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.191564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r" event={"ID":"15f1395e-0368-4e95-aff7-a2545c78886f","Type":"ContainerDied","Data":"c0b83ec2aa6057e770561add910af48e27cf4452c52731044b8867a6b79c4ab6"}
Apr 24 22:37:15.583532 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.583468 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"]
Apr 24 22:37:15.586650 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.586634 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"
Apr 24 22:37:15.589206 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.589176 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 24 22:37:15.590463 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.590446 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-85t5v\""
Apr 24 22:37:15.591202 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.591182 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 24 22:37:15.604004 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.603984 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"]
Apr 24 22:37:15.686012 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.685983 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2xbwc\" (UID: \"d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"
Apr 24 22:37:15.686124 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.686023 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pb9k\" (UniqueName: \"kubernetes.io/projected/d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505-kube-api-access-8pb9k\") pod \"servicemesh-operator3-55f49c5f94-2xbwc\" (UID: \"d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"
Apr 24 22:37:15.786336 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.786314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2xbwc\" (UID: \"d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"
Apr 24 22:37:15.786402 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.786340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pb9k\" (UniqueName: \"kubernetes.io/projected/d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505-kube-api-access-8pb9k\") pod \"servicemesh-operator3-55f49c5f94-2xbwc\" (UID: \"d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"
Apr 24 22:37:15.788646 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.788623 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2xbwc\" (UID: \"d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"
Apr 24 22:37:15.795604 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.795581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pb9k\" (UniqueName: \"kubernetes.io/projected/d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505-kube-api-access-8pb9k\") pod \"servicemesh-operator3-55f49c5f94-2xbwc\" (UID: \"d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"
Apr 24 22:37:15.895013 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:15.894988 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"
Apr 24 22:37:16.016852 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:16.016829 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc"]
Apr 24 22:37:16.019052 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:16.019028 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a0d4b5_8e1f_41e1_8e19_c787ab2cc505.slice/crio-2a35ed24256ccbd294197e3a3e1b462f36370b733278855763beb2d138ae623e WatchSource:0}: Error finding container 2a35ed24256ccbd294197e3a3e1b462f36370b733278855763beb2d138ae623e: Status 404 returned error can't find the container with id 2a35ed24256ccbd294197e3a3e1b462f36370b733278855763beb2d138ae623e
Apr 24 22:37:16.198754 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:16.198677 2571 generic.go:358] "Generic (PLEG): container finished" podID="15f1395e-0368-4e95-aff7-a2545c78886f" containerID="32c62a16c96d69293a227d9034c29daf47241858441bf474fabe9edcccfe7a13" exitCode=0
Apr 24 22:37:16.199175 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:16.198762 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r" event={"ID":"15f1395e-0368-4e95-aff7-a2545c78886f","Type":"ContainerDied","Data":"32c62a16c96d69293a227d9034c29daf47241858441bf474fabe9edcccfe7a13"}
Apr 24 22:37:16.200019 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:16.199996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc" event={"ID":"d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505","Type":"ContainerStarted","Data":"2a35ed24256ccbd294197e3a3e1b462f36370b733278855763beb2d138ae623e"}
Apr 24 22:37:17.353939 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.353913 2571 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r" Apr 24 22:37:17.502046 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.501899 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-util\") pod \"15f1395e-0368-4e95-aff7-a2545c78886f\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " Apr 24 22:37:17.502046 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.501991 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwr5h\" (UniqueName: \"kubernetes.io/projected/15f1395e-0368-4e95-aff7-a2545c78886f-kube-api-access-nwr5h\") pod \"15f1395e-0368-4e95-aff7-a2545c78886f\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " Apr 24 22:37:17.502046 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.502054 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-bundle\") pod \"15f1395e-0368-4e95-aff7-a2545c78886f\" (UID: \"15f1395e-0368-4e95-aff7-a2545c78886f\") " Apr 24 22:37:17.503333 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.503297 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-bundle" (OuterVolumeSpecName: "bundle") pod "15f1395e-0368-4e95-aff7-a2545c78886f" (UID: "15f1395e-0368-4e95-aff7-a2545c78886f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:37:17.504570 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.504539 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f1395e-0368-4e95-aff7-a2545c78886f-kube-api-access-nwr5h" (OuterVolumeSpecName: "kube-api-access-nwr5h") pod "15f1395e-0368-4e95-aff7-a2545c78886f" (UID: "15f1395e-0368-4e95-aff7-a2545c78886f"). InnerVolumeSpecName "kube-api-access-nwr5h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:37:17.508928 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.508889 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-util" (OuterVolumeSpecName: "util") pod "15f1395e-0368-4e95-aff7-a2545c78886f" (UID: "15f1395e-0368-4e95-aff7-a2545c78886f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:37:17.602622 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.602589 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:37:17.602622 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.602620 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15f1395e-0368-4e95-aff7-a2545c78886f-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:37:17.602851 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:17.602635 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nwr5h\" (UniqueName: \"kubernetes.io/projected/15f1395e-0368-4e95-aff7-a2545c78886f-kube-api-access-nwr5h\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\"" Apr 24 22:37:18.208084 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:18.208004 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r" event={"ID":"15f1395e-0368-4e95-aff7-a2545c78886f","Type":"ContainerDied","Data":"934da864c4ed0a536494b65b0001dc8095ca39131ad2d986b16bf45b242cbaff"} Apr 24 22:37:18.208084 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:18.208042 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="934da864c4ed0a536494b65b0001dc8095ca39131ad2d986b16bf45b242cbaff" Apr 24 22:37:18.208084 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:18.208077 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805eb5zz4r" Apr 24 22:37:19.213337 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:19.213292 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc" event={"ID":"d5a0d4b5-8e1f-41e1-8e19-c787ab2cc505","Type":"ContainerStarted","Data":"c8e123eb2f2a88ce1f193ade137ef3d02d710bb281813f6b1a5c398e6413739c"} Apr 24 22:37:19.213701 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:19.213416 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc" Apr 24 22:37:19.271651 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:19.271609 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc" podStartSLOduration=1.933502221 podStartE2EDuration="4.271594504s" podCreationTimestamp="2026-04-24 22:37:15 +0000 UTC" firstStartedPulling="2026-04-24 22:37:16.021456004 +0000 UTC m=+444.718078836" lastFinishedPulling="2026-04-24 22:37:18.359548287 +0000 UTC m=+447.056171119" observedRunningTime="2026-04-24 22:37:19.270716527 +0000 UTC m=+447.967339376" watchObservedRunningTime="2026-04-24 22:37:19.271594504 +0000 UTC m=+447.968217374" Apr 24 22:37:22.941332 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.941301 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9"] Apr 24 22:37:22.941681 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.941539 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15f1395e-0368-4e95-aff7-a2545c78886f" containerName="util" Apr 24 22:37:22.952653 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.952634 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f1395e-0368-4e95-aff7-a2545c78886f" containerName="util" Apr 24 22:37:22.952722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.952667 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15f1395e-0368-4e95-aff7-a2545c78886f" containerName="extract" Apr 24 22:37:22.952722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.952675 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f1395e-0368-4e95-aff7-a2545c78886f" containerName="extract" Apr 24 22:37:22.952722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.952686 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15f1395e-0368-4e95-aff7-a2545c78886f" containerName="pull" Apr 24 22:37:22.952722 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.952692 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f1395e-0368-4e95-aff7-a2545c78886f" containerName="pull" Apr 24 22:37:22.952836 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.952766 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="15f1395e-0368-4e95-aff7-a2545c78886f" containerName="extract" Apr 24 22:37:22.955665 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.955646 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:22.959179 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.958406 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-dhx6b\"" Apr 24 22:37:22.959179 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.958786 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 24 22:37:22.959179 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.958885 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 22:37:22.960074 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.960058 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 24 22:37:22.960457 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.960086 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 24 22:37:22.960640 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.960625 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 24 22:37:22.960836 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.960815 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 22:37:22.963574 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:22.961369 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9"] Apr 24 22:37:23.038648 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.038619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.038774 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.038655 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.038774 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.038687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.038774 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.038752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.038888 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.038811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: 
\"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.038888 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.038855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.038970 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.038886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbwm\" (UniqueName: \"kubernetes.io/projected/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-kube-api-access-jdbwm\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.139918 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.139888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.140051 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.139924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbwm\" (UniqueName: \"kubernetes.io/projected/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-kube-api-access-jdbwm\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.140051 ip-10-0-133-9 kubenswrapper[2571]: 
I0424 22:37:23.139941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.140051 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.139957 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.140051 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.139982 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.140298 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.140142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.140298 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.140212 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-token\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.141022 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.140995 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.142614 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.142593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.142766 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.142744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.142829 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.142758 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.142829 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.142785 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.148373 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.148352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.148693 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.148675 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbwm\" (UniqueName: \"kubernetes.io/projected/ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e-kube-api-access-jdbwm\") pod \"istiod-openshift-gateway-7cd77c7ffd-g76n9\" (UID: \"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.269754 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.269670 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:23.391361 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:23.391329 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9"] Apr 24 22:37:23.394224 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:23.394195 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbee1ad_1d5f_4fca_bbfe_f9bbb65ed55e.slice/crio-57e25b68313d9d077ba8fdbc01d22d91a4a130cc99e860c2385fa6216f992f4d WatchSource:0}: Error finding container 57e25b68313d9d077ba8fdbc01d22d91a4a130cc99e860c2385fa6216f992f4d: Status 404 returned error can't find the container with id 57e25b68313d9d077ba8fdbc01d22d91a4a130cc99e860c2385fa6216f992f4d Apr 24 22:37:24.229029 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:24.228993 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" event={"ID":"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e","Type":"ContainerStarted","Data":"57e25b68313d9d077ba8fdbc01d22d91a4a130cc99e860c2385fa6216f992f4d"} Apr 24 22:37:26.042537 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:26.042502 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 22:37:26.042844 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:26.042566 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 22:37:26.237863 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:26.237830 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" 
event={"ID":"ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e","Type":"ContainerStarted","Data":"0536a6d8d678ab1ff19f4d73f3213d48af1fb3a8a08f29e808b9e98d6013e151"} Apr 24 22:37:26.238047 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:26.237919 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:26.279867 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:26.279533 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" podStartSLOduration=1.63426768 podStartE2EDuration="4.279513254s" podCreationTimestamp="2026-04-24 22:37:22 +0000 UTC" firstStartedPulling="2026-04-24 22:37:23.397056321 +0000 UTC m=+452.093679162" lastFinishedPulling="2026-04-24 22:37:26.042301908 +0000 UTC m=+454.738924736" observedRunningTime="2026-04-24 22:37:26.276272217 +0000 UTC m=+454.972895068" watchObservedRunningTime="2026-04-24 22:37:26.279513254 +0000 UTC m=+454.976136105" Apr 24 22:37:27.242588 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:27.242559 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-g76n9" Apr 24 22:37:29.926170 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.926120 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn"] Apr 24 22:37:29.929116 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.929098 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.931384 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.931358 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-7kcsf\"" Apr 24 22:37:29.941816 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.941791 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn"] Apr 24 22:37:29.999435 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999413 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.999549 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.999549 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999488 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.999549 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.999549 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.999549 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqf7t\" (UniqueName: \"kubernetes.io/projected/39bb0938-e759-4d2f-8431-f1d5fec395fe-kube-api-access-pqf7t\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.999742 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.999742 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:29.999742 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:29.999625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/39bb0938-e759-4d2f-8431-f1d5fec395fe-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100483 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100611 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 
22:37:30.100611 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100504 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100611 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100521 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqf7t\" (UniqueName: \"kubernetes.io/projected/39bb0938-e759-4d2f-8431-f1d5fec395fe-kube-api-access-pqf7t\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100611 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100574 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100815 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100815 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100637 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/39bb0938-e759-4d2f-8431-f1d5fec395fe-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100815 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100961 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100840 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.100961 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100874 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.101062 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.100952 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.101200 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.101180 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.101272 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.101197 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.101525 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.101504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/39bb0938-e759-4d2f-8431-f1d5fec395fe-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.102809 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.102787 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-envoy\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.103110 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.103090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.107999 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.107977 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqf7t\" (UniqueName: \"kubernetes.io/projected/39bb0938-e759-4d2f-8431-f1d5fec395fe-kube-api-access-pqf7t\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.108103 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.108026 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/39bb0938-e759-4d2f-8431-f1d5fec395fe-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-gl4xn\" (UID: \"39bb0938-e759-4d2f-8431-f1d5fec395fe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.218592 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.218539 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2xbwc" Apr 24 22:37:30.241300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.241276 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:30.371757 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:30.371729 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn"] Apr 24 22:37:30.373917 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:30.373892 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39bb0938_e759_4d2f_8431_f1d5fec395fe.slice/crio-767d4d30980cb382f237a233bf9f359d4f68c8f974979fbc22f7add1673d0855 WatchSource:0}: Error finding container 767d4d30980cb382f237a233bf9f359d4f68c8f974979fbc22f7add1673d0855: Status 404 returned error can't find the container with id 767d4d30980cb382f237a233bf9f359d4f68c8f974979fbc22f7add1673d0855 Apr 24 22:37:31.255746 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:31.255707 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" event={"ID":"39bb0938-e759-4d2f-8431-f1d5fec395fe","Type":"ContainerStarted","Data":"767d4d30980cb382f237a233bf9f359d4f68c8f974979fbc22f7add1673d0855"} Apr 24 22:37:32.479082 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:32.479047 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 22:37:32.479400 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:32.479112 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 22:37:32.479400 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:32.479141 2571 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 24 22:37:33.264567 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:33.264522 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" event={"ID":"39bb0938-e759-4d2f-8431-f1d5fec395fe","Type":"ContainerStarted","Data":"e4e1e31437fa6eb687a21c71aec0b3fe34d9a315b5b95b87bedee8308838a035"} Apr 24 22:37:33.285344 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:33.285301 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" podStartSLOduration=2.182811548 podStartE2EDuration="4.285289196s" podCreationTimestamp="2026-04-24 22:37:29 +0000 UTC" firstStartedPulling="2026-04-24 22:37:30.3763638 +0000 UTC m=+459.072986628" lastFinishedPulling="2026-04-24 22:37:32.478841448 +0000 UTC m=+461.175464276" observedRunningTime="2026-04-24 22:37:33.28322758 +0000 UTC m=+461.979850430" watchObservedRunningTime="2026-04-24 22:37:33.285289196 +0000 UTC m=+461.981912047" Apr 24 22:37:34.241960 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:34.241927 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:34.246363 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:34.246335 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:34.267765 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:34.267744 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:34.268539 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:34.268520 2571 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-gl4xn" Apr 24 22:37:37.846486 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.846449 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv"] Apr 24 22:37:37.851070 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.851050 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:37.853205 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.853184 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 22:37:37.853341 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.853245 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bdjkc\"" Apr 24 22:37:37.853341 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.853188 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 22:37:37.857458 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.857424 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv"] Apr 24 22:37:37.950313 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.950277 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx"] Apr 24 22:37:37.953671 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.953653 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:37.961848 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.961823 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx"] Apr 24 22:37:37.962617 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.962595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:37.962699 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.962632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9h25\" (UniqueName: \"kubernetes.io/projected/14ead698-91fd-475c-bde7-b8d2f877e636-kube-api-access-l9h25\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:37.962822 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:37.962795 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:38.047618 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.047591 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh"] Apr 24 22:37:38.050875 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.050859 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.058110 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.058091 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh"] Apr 24 22:37:38.064047 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.064024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:38.064148 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.064065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9h25\" (UniqueName: \"kubernetes.io/projected/14ead698-91fd-475c-bde7-b8d2f877e636-kube-api-access-l9h25\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:38.064148 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.064111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5g5d\" (UniqueName: \"kubernetes.io/projected/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-kube-api-access-g5g5d\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.064233 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.064190 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:38.064282 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.064231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.064345 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.064318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.064441 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.064425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:38.064550 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.064531 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:38.072312 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.072291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9h25\" (UniqueName: \"kubernetes.io/projected/14ead698-91fd-475c-bde7-b8d2f877e636-kube-api-access-l9h25\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:38.150931 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.150906 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"] Apr 24 22:37:38.154300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.154287 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" Apr 24 22:37:38.160590 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.160449 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" Apr 24 22:37:38.162895 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.162872 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"] Apr 24 22:37:38.165307 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.165285 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.165397 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.165326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.165397 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.165356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.165484 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.165415 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d9dd\" (UniqueName: 
\"kubernetes.io/projected/852ad783-227b-450f-bd3f-73b5b6264831-kube-api-access-6d9dd\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.165484 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.165446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5g5d\" (UniqueName: \"kubernetes.io/projected/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-kube-api-access-g5g5d\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.165590 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.165521 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.165757 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.165742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.165813 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.165793 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.175280 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.175261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5g5d\" (UniqueName: \"kubernetes.io/projected/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-kube-api-access-g5g5d\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.263601 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.263570 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" Apr 24 22:37:38.266519 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.266498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6d9dd\" (UniqueName: \"kubernetes.io/projected/852ad783-227b-450f-bd3f-73b5b6264831-kube-api-access-6d9dd\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.266579 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.266537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" 
Apr 24 22:37:38.266579 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.266567 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5tbq\" (UniqueName: \"kubernetes.io/projected/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-kube-api-access-q5tbq\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" Apr 24 22:37:38.266662 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.266635 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" Apr 24 22:37:38.266828 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.266808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.266870 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.266860 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" Apr 24 22:37:38.267047 
ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.267028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.267090 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.267058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.277578 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.277540 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv"] Apr 24 22:37:38.278949 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.278925 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d9dd\" (UniqueName: \"kubernetes.io/projected/852ad783-227b-450f-bd3f-73b5b6264831-kube-api-access-6d9dd\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" Apr 24 22:37:38.280205 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:38.280183 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14ead698_91fd_475c_bde7_b8d2f877e636.slice/crio-ecc3a227d4c9e27b705606e11e16f8dd3abcfb3983b8dd98dd9e0620a7b63cc2 WatchSource:0}: Error finding container 
ecc3a227d4c9e27b705606e11e16f8dd3abcfb3983b8dd98dd9e0620a7b63cc2: Status 404 returned error can't find the container with id ecc3a227d4c9e27b705606e11e16f8dd3abcfb3983b8dd98dd9e0620a7b63cc2
Apr 24 22:37:38.359687 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.359662 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh"
Apr 24 22:37:38.367565 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.367533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5tbq\" (UniqueName: \"kubernetes.io/projected/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-kube-api-access-q5tbq\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:38.367676 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.367575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:38.367676 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.367623 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:38.368012 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.367991 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:38.368057 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.368040 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:38.375583 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.375557 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5tbq\" (UniqueName: \"kubernetes.io/projected/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-kube-api-access-q5tbq\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:38.385338 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.385305 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx"]
Apr 24 22:37:38.386212 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:38.386147 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82295d2_8b92_4a22_bc65_1b5cd602ab4a.slice/crio-05fc44e61a397df5626a76c307b0883a257a10556216b9359205cbc101b78c84 WatchSource:0}: Error finding container 05fc44e61a397df5626a76c307b0883a257a10556216b9359205cbc101b78c84: Status 404 returned error can't find the container with id 05fc44e61a397df5626a76c307b0883a257a10556216b9359205cbc101b78c84
Apr 24 22:37:38.463686 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.463661 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:38.477101 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.477076 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh"]
Apr 24 22:37:38.478337 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:38.478302 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod852ad783_227b_450f_bd3f_73b5b6264831.slice/crio-2cac13661f83d6bcc2b5f411d9194c151c683738b1504fcafd36136156ba26a2 WatchSource:0}: Error finding container 2cac13661f83d6bcc2b5f411d9194c151c683738b1504fcafd36136156ba26a2: Status 404 returned error can't find the container with id 2cac13661f83d6bcc2b5f411d9194c151c683738b1504fcafd36136156ba26a2
Apr 24 22:37:38.791362 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:38.791307 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"]
Apr 24 22:37:38.793439 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:38.793410 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b639d88_52ca_4dd8_81eb_dcf9d2598adf.slice/crio-4408ab9142119eda4fd729c853dcc1b37f4edadde02a4c30cf8fee78f2875a97 WatchSource:0}: Error finding container 4408ab9142119eda4fd729c853dcc1b37f4edadde02a4c30cf8fee78f2875a97: Status 404 returned error can't find the container with id 4408ab9142119eda4fd729c853dcc1b37f4edadde02a4c30cf8fee78f2875a97
Apr 24 22:37:39.285406 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.285373 2571 generic.go:358] "Generic (PLEG): container finished" podID="14ead698-91fd-475c-bde7-b8d2f877e636" containerID="aef42464abd3362160e713f337bdda82ab85178fb8500765294945c6a127079f" exitCode=0
Apr 24 22:37:39.285833 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.285465 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" event={"ID":"14ead698-91fd-475c-bde7-b8d2f877e636","Type":"ContainerDied","Data":"aef42464abd3362160e713f337bdda82ab85178fb8500765294945c6a127079f"}
Apr 24 22:37:39.285833 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.285495 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" event={"ID":"14ead698-91fd-475c-bde7-b8d2f877e636","Type":"ContainerStarted","Data":"ecc3a227d4c9e27b705606e11e16f8dd3abcfb3983b8dd98dd9e0620a7b63cc2"}
Apr 24 22:37:39.286954 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.286847 2571 generic.go:358] "Generic (PLEG): container finished" podID="852ad783-227b-450f-bd3f-73b5b6264831" containerID="a49e21b1581809e3cc91ef60452e7783952a9237f7fe3ba1f278ecb267e555f6" exitCode=0
Apr 24 22:37:39.286954 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.286918 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" event={"ID":"852ad783-227b-450f-bd3f-73b5b6264831","Type":"ContainerDied","Data":"a49e21b1581809e3cc91ef60452e7783952a9237f7fe3ba1f278ecb267e555f6"}
Apr 24 22:37:39.286954 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.286939 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" event={"ID":"852ad783-227b-450f-bd3f-73b5b6264831","Type":"ContainerStarted","Data":"2cac13661f83d6bcc2b5f411d9194c151c683738b1504fcafd36136156ba26a2"}
Apr 24 22:37:39.288482 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.288460 2571 generic.go:358] "Generic (PLEG): container finished" podID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerID="720bf26f09cad685d14c8631bf98a30e841146dfc93adfdd2c318dcbfb0d0f56" exitCode=0
Apr 24 22:37:39.288571 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.288529 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" event={"ID":"6b639d88-52ca-4dd8-81eb-dcf9d2598adf","Type":"ContainerDied","Data":"720bf26f09cad685d14c8631bf98a30e841146dfc93adfdd2c318dcbfb0d0f56"}
Apr 24 22:37:39.288571 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.288552 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" event={"ID":"6b639d88-52ca-4dd8-81eb-dcf9d2598adf","Type":"ContainerStarted","Data":"4408ab9142119eda4fd729c853dcc1b37f4edadde02a4c30cf8fee78f2875a97"}
Apr 24 22:37:39.290041 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.290022 2571 generic.go:358] "Generic (PLEG): container finished" podID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerID="d6317aa3137695515db9f40225a4356cb876e52e68c9546b0bba99ca5c1bbbaf" exitCode=0
Apr 24 22:37:39.290142 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.290053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" event={"ID":"f82295d2-8b92-4a22-bc65-1b5cd602ab4a","Type":"ContainerDied","Data":"d6317aa3137695515db9f40225a4356cb876e52e68c9546b0bba99ca5c1bbbaf"}
Apr 24 22:37:39.290142 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:39.290071 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" event={"ID":"f82295d2-8b92-4a22-bc65-1b5cd602ab4a","Type":"ContainerStarted","Data":"05fc44e61a397df5626a76c307b0883a257a10556216b9359205cbc101b78c84"}
Apr 24 22:37:40.295280 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:40.295249 2571 generic.go:358] "Generic (PLEG): container finished" podID="14ead698-91fd-475c-bde7-b8d2f877e636" containerID="5518769cb93d15420c71baf496650943c0d04683766573ef9664a293d55ccc83" exitCode=0
Apr 24 22:37:40.295623 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:40.295250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" event={"ID":"14ead698-91fd-475c-bde7-b8d2f877e636","Type":"ContainerDied","Data":"5518769cb93d15420c71baf496650943c0d04683766573ef9664a293d55ccc83"}
Apr 24 22:37:40.297111 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:40.297082 2571 generic.go:358] "Generic (PLEG): container finished" podID="852ad783-227b-450f-bd3f-73b5b6264831" containerID="806874210a7db6398c3f0ad9f37d75f73a868fa5a78992bcd4ae64f4cf1b1914" exitCode=0
Apr 24 22:37:40.297234 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:40.297187 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" event={"ID":"852ad783-227b-450f-bd3f-73b5b6264831","Type":"ContainerDied","Data":"806874210a7db6398c3f0ad9f37d75f73a868fa5a78992bcd4ae64f4cf1b1914"}
Apr 24 22:37:40.300579 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:40.300490 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" event={"ID":"f82295d2-8b92-4a22-bc65-1b5cd602ab4a","Type":"ContainerStarted","Data":"9f7b0aabd493ed3eff848136ca042337315dc6082e1f3f04d77ee802e78b8abe"}
Apr 24 22:37:41.305799 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:41.305765 2571 generic.go:358] "Generic (PLEG): container finished" podID="852ad783-227b-450f-bd3f-73b5b6264831" containerID="941788b9f951c94456ebec0901a15ee72f30788cb4c3f348f44e793a06ec5854" exitCode=0
Apr 24 22:37:41.306293 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:41.305841 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" event={"ID":"852ad783-227b-450f-bd3f-73b5b6264831","Type":"ContainerDied","Data":"941788b9f951c94456ebec0901a15ee72f30788cb4c3f348f44e793a06ec5854"}
Apr 24 22:37:41.307411 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:41.307390 2571 generic.go:358] "Generic (PLEG): container finished" podID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerID="f6638367ec5e99e77a7a6a0ce1bb1d8853f83e2ae56bee2edbb495dbca674d7f" exitCode=0
Apr 24 22:37:41.307522 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:41.307464 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" event={"ID":"6b639d88-52ca-4dd8-81eb-dcf9d2598adf","Type":"ContainerDied","Data":"f6638367ec5e99e77a7a6a0ce1bb1d8853f83e2ae56bee2edbb495dbca674d7f"}
Apr 24 22:37:41.309141 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:41.309117 2571 generic.go:358] "Generic (PLEG): container finished" podID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerID="9f7b0aabd493ed3eff848136ca042337315dc6082e1f3f04d77ee802e78b8abe" exitCode=0
Apr 24 22:37:41.309224 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:41.309189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" event={"ID":"f82295d2-8b92-4a22-bc65-1b5cd602ab4a","Type":"ContainerDied","Data":"9f7b0aabd493ed3eff848136ca042337315dc6082e1f3f04d77ee802e78b8abe"}
Apr 24 22:37:41.311066 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:41.311005 2571 generic.go:358] "Generic (PLEG): container finished" podID="14ead698-91fd-475c-bde7-b8d2f877e636" containerID="1455f2b4033051542438d8d1c70d72dc2f5e34cbdca123fae12e33f6e785b4e6" exitCode=0
Apr 24 22:37:41.311128 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:41.311079 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" event={"ID":"14ead698-91fd-475c-bde7-b8d2f877e636","Type":"ContainerDied","Data":"1455f2b4033051542438d8d1c70d72dc2f5e34cbdca123fae12e33f6e785b4e6"}
Apr 24 22:37:42.316353 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.316322 2571 generic.go:358] "Generic (PLEG): container finished" podID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerID="af7386fe5c68036dd3606c395d81335059ce50f3cb3be2acbf799845cc898a20" exitCode=0
Apr 24 22:37:42.316808 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.316411 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" event={"ID":"6b639d88-52ca-4dd8-81eb-dcf9d2598adf","Type":"ContainerDied","Data":"af7386fe5c68036dd3606c395d81335059ce50f3cb3be2acbf799845cc898a20"}
Apr 24 22:37:42.318003 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.317975 2571 generic.go:358] "Generic (PLEG): container finished" podID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerID="4394aac7b8c49a66a9c8bcfd40b81a8dabd545938e7aeeab58690a16e40278cc" exitCode=0
Apr 24 22:37:42.318108 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.318057 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" event={"ID":"f82295d2-8b92-4a22-bc65-1b5cd602ab4a","Type":"ContainerDied","Data":"4394aac7b8c49a66a9c8bcfd40b81a8dabd545938e7aeeab58690a16e40278cc"}
Apr 24 22:37:42.465899 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.465877 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv"
Apr 24 22:37:42.468811 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.468791 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh"
Apr 24 22:37:42.601805 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.601713 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-bundle\") pod \"852ad783-227b-450f-bd3f-73b5b6264831\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") "
Apr 24 22:37:42.601805 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.601760 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-bundle\") pod \"14ead698-91fd-475c-bde7-b8d2f877e636\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") "
Apr 24 22:37:42.601805 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.601791 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9h25\" (UniqueName: \"kubernetes.io/projected/14ead698-91fd-475c-bde7-b8d2f877e636-kube-api-access-l9h25\") pod \"14ead698-91fd-475c-bde7-b8d2f877e636\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") "
Apr 24 22:37:42.601805 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.601814 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d9dd\" (UniqueName: \"kubernetes.io/projected/852ad783-227b-450f-bd3f-73b5b6264831-kube-api-access-6d9dd\") pod \"852ad783-227b-450f-bd3f-73b5b6264831\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") "
Apr 24 22:37:42.602133 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.601866 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-util\") pod \"852ad783-227b-450f-bd3f-73b5b6264831\" (UID: \"852ad783-227b-450f-bd3f-73b5b6264831\") "
Apr 24 22:37:42.602133 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.602021 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-util\") pod \"14ead698-91fd-475c-bde7-b8d2f877e636\" (UID: \"14ead698-91fd-475c-bde7-b8d2f877e636\") "
Apr 24 22:37:42.602339 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.602320 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-bundle" (OuterVolumeSpecName: "bundle") pod "14ead698-91fd-475c-bde7-b8d2f877e636" (UID: "14ead698-91fd-475c-bde7-b8d2f877e636"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:42.602448 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.602421 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-bundle" (OuterVolumeSpecName: "bundle") pod "852ad783-227b-450f-bd3f-73b5b6264831" (UID: "852ad783-227b-450f-bd3f-73b5b6264831"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:42.604204 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.604148 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ead698-91fd-475c-bde7-b8d2f877e636-kube-api-access-l9h25" (OuterVolumeSpecName: "kube-api-access-l9h25") pod "14ead698-91fd-475c-bde7-b8d2f877e636" (UID: "14ead698-91fd-475c-bde7-b8d2f877e636"). InnerVolumeSpecName "kube-api-access-l9h25". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:37:42.604352 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.604310 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852ad783-227b-450f-bd3f-73b5b6264831-kube-api-access-6d9dd" (OuterVolumeSpecName: "kube-api-access-6d9dd") pod "852ad783-227b-450f-bd3f-73b5b6264831" (UID: "852ad783-227b-450f-bd3f-73b5b6264831"). InnerVolumeSpecName "kube-api-access-6d9dd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:37:42.608214 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.608194 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-util" (OuterVolumeSpecName: "util") pod "852ad783-227b-450f-bd3f-73b5b6264831" (UID: "852ad783-227b-450f-bd3f-73b5b6264831"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:42.608275 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.608214 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-util" (OuterVolumeSpecName: "util") pod "14ead698-91fd-475c-bde7-b8d2f877e636" (UID: "14ead698-91fd-475c-bde7-b8d2f877e636"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:42.702664 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.702643 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:42.702748 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.702671 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9h25\" (UniqueName: \"kubernetes.io/projected/14ead698-91fd-475c-bde7-b8d2f877e636-kube-api-access-l9h25\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:42.702748 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.702690 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6d9dd\" (UniqueName: \"kubernetes.io/projected/852ad783-227b-450f-bd3f-73b5b6264831-kube-api-access-6d9dd\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:42.702748 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.702705 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:42.702748 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.702720 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/14ead698-91fd-475c-bde7-b8d2f877e636-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:42.702748 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:42.702733 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/852ad783-227b-450f-bd3f-73b5b6264831-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:43.324001 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.323971 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv"
Apr 24 22:37:43.324001 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.323993 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88qrksv" event={"ID":"14ead698-91fd-475c-bde7-b8d2f877e636","Type":"ContainerDied","Data":"ecc3a227d4c9e27b705606e11e16f8dd3abcfb3983b8dd98dd9e0620a7b63cc2"}
Apr 24 22:37:43.324505 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.324027 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc3a227d4c9e27b705606e11e16f8dd3abcfb3983b8dd98dd9e0620a7b63cc2"
Apr 24 22:37:43.325859 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.325831 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh"
Apr 24 22:37:43.325859 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.325849 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gkgvh" event={"ID":"852ad783-227b-450f-bd3f-73b5b6264831","Type":"ContainerDied","Data":"2cac13661f83d6bcc2b5f411d9194c151c683738b1504fcafd36136156ba26a2"}
Apr 24 22:37:43.326054 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.325874 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cac13661f83d6bcc2b5f411d9194c151c683738b1504fcafd36136156ba26a2"
Apr 24 22:37:43.473305 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.473283 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx"
Apr 24 22:37:43.496751 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.496723 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:43.607660 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.607594 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-bundle\") pod \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") "
Apr 24 22:37:43.607660 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.607654 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5tbq\" (UniqueName: \"kubernetes.io/projected/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-kube-api-access-q5tbq\") pod \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") "
Apr 24 22:37:43.607804 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.607682 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-bundle\") pod \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") "
Apr 24 22:37:43.607804 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.607774 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-util\") pod \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\" (UID: \"6b639d88-52ca-4dd8-81eb-dcf9d2598adf\") "
Apr 24 22:37:43.607887 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.607828 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5g5d\" (UniqueName: \"kubernetes.io/projected/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-kube-api-access-g5g5d\") pod \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") "
Apr 24 22:37:43.607887 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.607864 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-util\") pod \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\" (UID: \"f82295d2-8b92-4a22-bc65-1b5cd602ab4a\") "
Apr 24 22:37:43.608098 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.608069 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-bundle" (OuterVolumeSpecName: "bundle") pod "6b639d88-52ca-4dd8-81eb-dcf9d2598adf" (UID: "6b639d88-52ca-4dd8-81eb-dcf9d2598adf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:43.608456 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.608417 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-bundle" (OuterVolumeSpecName: "bundle") pod "f82295d2-8b92-4a22-bc65-1b5cd602ab4a" (UID: "f82295d2-8b92-4a22-bc65-1b5cd602ab4a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:43.609823 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.609803 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-kube-api-access-q5tbq" (OuterVolumeSpecName: "kube-api-access-q5tbq") pod "6b639d88-52ca-4dd8-81eb-dcf9d2598adf" (UID: "6b639d88-52ca-4dd8-81eb-dcf9d2598adf"). InnerVolumeSpecName "kube-api-access-q5tbq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:37:43.609985 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.609966 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-kube-api-access-g5g5d" (OuterVolumeSpecName: "kube-api-access-g5g5d") pod "f82295d2-8b92-4a22-bc65-1b5cd602ab4a" (UID: "f82295d2-8b92-4a22-bc65-1b5cd602ab4a"). InnerVolumeSpecName "kube-api-access-g5g5d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:37:43.616686 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.616663 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-util" (OuterVolumeSpecName: "util") pod "6b639d88-52ca-4dd8-81eb-dcf9d2598adf" (UID: "6b639d88-52ca-4dd8-81eb-dcf9d2598adf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:43.616786 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.616769 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-util" (OuterVolumeSpecName: "util") pod "f82295d2-8b92-4a22-bc65-1b5cd602ab4a" (UID: "f82295d2-8b92-4a22-bc65-1b5cd602ab4a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:37:43.708570 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.708540 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:43.708570 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.708565 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q5tbq\" (UniqueName: \"kubernetes.io/projected/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-kube-api-access-q5tbq\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:43.708570 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.708576 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-bundle\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:43.708755 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.708585 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b639d88-52ca-4dd8-81eb-dcf9d2598adf-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:43.708755 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.708593 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5g5d\" (UniqueName: \"kubernetes.io/projected/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-kube-api-access-g5g5d\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:43.708755 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:43.708601 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f82295d2-8b92-4a22-bc65-1b5cd602ab4a-util\") on node \"ip-10-0-133-9.ec2.internal\" DevicePath \"\""
Apr 24 22:37:44.331897 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:44.331858 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9" event={"ID":"6b639d88-52ca-4dd8-81eb-dcf9d2598adf","Type":"ContainerDied","Data":"4408ab9142119eda4fd729c853dcc1b37f4edadde02a4c30cf8fee78f2875a97"}
Apr 24 22:37:44.331897 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:44.331895 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4408ab9142119eda4fd729c853dcc1b37f4edadde02a4c30cf8fee78f2875a97"
Apr 24 22:37:44.332458 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:44.331910 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bvmkl9"
Apr 24 22:37:44.333623 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:44.333600 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx" event={"ID":"f82295d2-8b92-4a22-bc65-1b5cd602ab4a","Type":"ContainerDied","Data":"05fc44e61a397df5626a76c307b0883a257a10556216b9359205cbc101b78c84"}
Apr 24 22:37:44.333748 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:44.333629 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05fc44e61a397df5626a76c307b0883a257a10556216b9359205cbc101b78c84"
Apr 24 22:37:44.333748 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:44.333606 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503vqwjx"
Apr 24 22:37:50.032930 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.032900 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx"]
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033143 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14ead698-91fd-475c-bde7-b8d2f877e636" containerName="pull"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033167 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ead698-91fd-475c-bde7-b8d2f877e636" containerName="pull"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033176 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="852ad783-227b-450f-bd3f-73b5b6264831" containerName="pull"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033181 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="852ad783-227b-450f-bd3f-73b5b6264831" containerName="pull"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033189 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerName="pull"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033194 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerName="pull"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033200 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerName="extract"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033206 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerName="extract"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033214 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerName="util"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033219 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerName="util"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033229 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerName="pull"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033234 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerName="pull"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033243 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14ead698-91fd-475c-bde7-b8d2f877e636" containerName="util"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033248 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ead698-91fd-475c-bde7-b8d2f877e636" containerName="util"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033254 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14ead698-91fd-475c-bde7-b8d2f877e636" containerName="extract"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033258 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ead698-91fd-475c-bde7-b8d2f877e636" containerName="extract"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033264 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="852ad783-227b-450f-bd3f-73b5b6264831" containerName="extract"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033269 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="852ad783-227b-450f-bd3f-73b5b6264831" containerName="extract"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033275 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="852ad783-227b-450f-bd3f-73b5b6264831" containerName="util"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033279 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="852ad783-227b-450f-bd3f-73b5b6264831" containerName="util"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033286 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerName="util"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033291 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerName="util"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033297 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerName="extract"
Apr 24 22:37:50.033300 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033301 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerName="extract"
Apr 24 22:37:50.034069 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033340 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="852ad783-227b-450f-bd3f-73b5b6264831" containerName="extract"
Apr 24 22:37:50.034069 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033350 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b639d88-52ca-4dd8-81eb-dcf9d2598adf" containerName="extract"
Apr 24 22:37:50.034069 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033357 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f82295d2-8b92-4a22-bc65-1b5cd602ab4a" containerName="extract"
Apr 24 22:37:50.034069 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.033364 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="14ead698-91fd-475c-bde7-b8d2f877e636" containerName="extract"
Apr 24 22:37:50.039075 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.039051 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx"
Apr 24 22:37:50.041041 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.041018 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-8s2jx\""
Apr 24 22:37:50.041176 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.041080 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 24 22:37:50.041176 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.041079 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 24 22:37:50.046473 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.046398 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx"]
Apr 24 22:37:50.049451 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.049429 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvrdq\" (UniqueName: \"kubernetes.io/projected/f5475da5-167b-425f-a7ad-5b0ce646f9fa-kube-api-access-xvrdq\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9btlx\" (UID: \"f5475da5-167b-425f-a7ad-5b0ce646f9fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx"
Apr 24 22:37:50.049547 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.049474 2571 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f5475da5-167b-425f-a7ad-5b0ce646f9fa-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9btlx\" (UID: \"f5475da5-167b-425f-a7ad-5b0ce646f9fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" Apr 24 22:37:50.150643 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.150613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvrdq\" (UniqueName: \"kubernetes.io/projected/f5475da5-167b-425f-a7ad-5b0ce646f9fa-kube-api-access-xvrdq\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9btlx\" (UID: \"f5475da5-167b-425f-a7ad-5b0ce646f9fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" Apr 24 22:37:50.150758 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.150658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f5475da5-167b-425f-a7ad-5b0ce646f9fa-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9btlx\" (UID: \"f5475da5-167b-425f-a7ad-5b0ce646f9fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" Apr 24 22:37:50.150986 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.150969 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f5475da5-167b-425f-a7ad-5b0ce646f9fa-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9btlx\" (UID: \"f5475da5-167b-425f-a7ad-5b0ce646f9fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" Apr 24 22:37:50.163012 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.162986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xvrdq\" (UniqueName: \"kubernetes.io/projected/f5475da5-167b-425f-a7ad-5b0ce646f9fa-kube-api-access-xvrdq\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-9btlx\" (UID: \"f5475da5-167b-425f-a7ad-5b0ce646f9fa\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" Apr 24 22:37:50.350966 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.350910 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" Apr 24 22:37:50.473095 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:50.473024 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx"] Apr 24 22:37:50.475490 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:50.475466 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5475da5_167b_425f_a7ad_5b0ce646f9fa.slice/crio-ba3fe1fd39cfae065e081b82698e4b5b20f2372cd923b8106b7becd44befb4cd WatchSource:0}: Error finding container ba3fe1fd39cfae065e081b82698e4b5b20f2372cd923b8106b7becd44befb4cd: Status 404 returned error can't find the container with id ba3fe1fd39cfae065e081b82698e4b5b20f2372cd923b8106b7becd44befb4cd Apr 24 22:37:51.358967 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:51.358929 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" event={"ID":"f5475da5-167b-425f-a7ad-5b0ce646f9fa","Type":"ContainerStarted","Data":"ba3fe1fd39cfae065e081b82698e4b5b20f2372cd923b8106b7becd44befb4cd"} Apr 24 22:37:54.358730 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.358694 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp"] Apr 24 22:37:54.362393 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.362369 2571 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" Apr 24 22:37:54.364740 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.364717 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-tnh28\"" Apr 24 22:37:54.365107 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.365085 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 24 22:37:54.374613 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.374587 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp"] Apr 24 22:37:54.378176 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.378133 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxm6x\" (UniqueName: \"kubernetes.io/projected/188f2129-2d73-4254-a70a-f30693a840a7-kube-api-access-xxm6x\") pod \"dns-operator-controller-manager-844548ff4c-jhjrp\" (UID: \"188f2129-2d73-4254-a70a-f30693a840a7\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" Apr 24 22:37:54.478778 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.478751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxm6x\" (UniqueName: \"kubernetes.io/projected/188f2129-2d73-4254-a70a-f30693a840a7-kube-api-access-xxm6x\") pod \"dns-operator-controller-manager-844548ff4c-jhjrp\" (UID: \"188f2129-2d73-4254-a70a-f30693a840a7\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" Apr 24 22:37:54.487825 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.487800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxm6x\" (UniqueName: \"kubernetes.io/projected/188f2129-2d73-4254-a70a-f30693a840a7-kube-api-access-xxm6x\") pod 
\"dns-operator-controller-manager-844548ff4c-jhjrp\" (UID: \"188f2129-2d73-4254-a70a-f30693a840a7\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" Apr 24 22:37:54.675995 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.675973 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" Apr 24 22:37:54.808216 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:54.808190 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp"] Apr 24 22:37:54.809862 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:54.809829 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod188f2129_2d73_4254_a70a_f30693a840a7.slice/crio-8b21c3da113b3e4e58b35bf24dfe12c41efe2615777a1c9668a15fa21f562863 WatchSource:0}: Error finding container 8b21c3da113b3e4e58b35bf24dfe12c41efe2615777a1c9668a15fa21f562863: Status 404 returned error can't find the container with id 8b21c3da113b3e4e58b35bf24dfe12c41efe2615777a1c9668a15fa21f562863 Apr 24 22:37:55.375459 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:55.375419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" event={"ID":"188f2129-2d73-4254-a70a-f30693a840a7","Type":"ContainerStarted","Data":"8b21c3da113b3e4e58b35bf24dfe12c41efe2615777a1c9668a15fa21f562863"} Apr 24 22:37:55.376730 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:55.376705 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" event={"ID":"f5475da5-167b-425f-a7ad-5b0ce646f9fa","Type":"ContainerStarted","Data":"c43fe3a4f3707bee13f5592ce5af84791d292a67ef6e131a4a42b767c7c73e2f"} Apr 24 22:37:55.376872 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:55.376861 2571 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" Apr 24 22:37:55.431962 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:55.431907 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" podStartSLOduration=1.278944984 podStartE2EDuration="5.431889606s" podCreationTimestamp="2026-04-24 22:37:50 +0000 UTC" firstStartedPulling="2026-04-24 22:37:50.47842556 +0000 UTC m=+479.175048388" lastFinishedPulling="2026-04-24 22:37:54.63137018 +0000 UTC m=+483.327993010" observedRunningTime="2026-04-24 22:37:55.431632709 +0000 UTC m=+484.128255560" watchObservedRunningTime="2026-04-24 22:37:55.431889606 +0000 UTC m=+484.128512468" Apr 24 22:37:57.069012 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.068977 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4zrjn"] Apr 24 22:37:57.072588 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.072546 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" Apr 24 22:37:57.080259 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.080236 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-d29rr\"" Apr 24 22:37:57.098476 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.098439 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlw7\" (UniqueName: \"kubernetes.io/projected/b094af7d-c6dc-4c67-8371-167384023680-kube-api-access-fjlw7\") pod \"authorino-operator-7587b89b76-4zrjn\" (UID: \"b094af7d-c6dc-4c67-8371-167384023680\") " pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" Apr 24 22:37:57.102284 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.102203 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4zrjn"] Apr 24 22:37:57.199030 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.198979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlw7\" (UniqueName: \"kubernetes.io/projected/b094af7d-c6dc-4c67-8371-167384023680-kube-api-access-fjlw7\") pod \"authorino-operator-7587b89b76-4zrjn\" (UID: \"b094af7d-c6dc-4c67-8371-167384023680\") " pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" Apr 24 22:37:57.212531 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.212502 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlw7\" (UniqueName: \"kubernetes.io/projected/b094af7d-c6dc-4c67-8371-167384023680-kube-api-access-fjlw7\") pod \"authorino-operator-7587b89b76-4zrjn\" (UID: \"b094af7d-c6dc-4c67-8371-167384023680\") " pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" Apr 24 22:37:57.384266 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.384245 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" Apr 24 22:37:57.385444 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.385419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" event={"ID":"188f2129-2d73-4254-a70a-f30693a840a7","Type":"ContainerStarted","Data":"ea71ac9eb87018225b2a0891aec506716ff647c09f8f94f9a3364628fc48ae15"} Apr 24 22:37:57.385581 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.385569 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" Apr 24 22:37:57.405687 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.405641 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" podStartSLOduration=0.920046048 podStartE2EDuration="3.405626273s" podCreationTimestamp="2026-04-24 22:37:54 +0000 UTC" firstStartedPulling="2026-04-24 22:37:54.812011788 +0000 UTC m=+483.508634616" lastFinishedPulling="2026-04-24 22:37:57.29759201 +0000 UTC m=+485.994214841" observedRunningTime="2026-04-24 22:37:57.403260398 +0000 UTC m=+486.099883248" watchObservedRunningTime="2026-04-24 22:37:57.405626273 +0000 UTC m=+486.102249123" Apr 24 22:37:57.508248 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:57.508223 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4zrjn"] Apr 24 22:37:57.509788 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:37:57.509755 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb094af7d_c6dc_4c67_8371_167384023680.slice/crio-cdde236444fca23b148717720a41a5aea70578482caf1fa94b25e8817af2a07f WatchSource:0}: Error finding container cdde236444fca23b148717720a41a5aea70578482caf1fa94b25e8817af2a07f: Status 404 returned error can't find 
the container with id cdde236444fca23b148717720a41a5aea70578482caf1fa94b25e8817af2a07f Apr 24 22:37:58.390582 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:58.390547 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" event={"ID":"b094af7d-c6dc-4c67-8371-167384023680","Type":"ContainerStarted","Data":"cdde236444fca23b148717720a41a5aea70578482caf1fa94b25e8817af2a07f"} Apr 24 22:37:59.395755 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:59.395718 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" event={"ID":"b094af7d-c6dc-4c67-8371-167384023680","Type":"ContainerStarted","Data":"03c94348e53469f2124b461ad397dd21ca30235e01bd5d4f89ccac9541794881"} Apr 24 22:37:59.396110 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:59.395821 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" Apr 24 22:37:59.415102 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:37:59.415062 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" podStartSLOduration=0.953835911 podStartE2EDuration="2.415050386s" podCreationTimestamp="2026-04-24 22:37:57 +0000 UTC" firstStartedPulling="2026-04-24 22:37:57.512032565 +0000 UTC m=+486.208655407" lastFinishedPulling="2026-04-24 22:37:58.973247051 +0000 UTC m=+487.669869882" observedRunningTime="2026-04-24 22:37:59.412797866 +0000 UTC m=+488.109420721" watchObservedRunningTime="2026-04-24 22:37:59.415050386 +0000 UTC m=+488.111673236" Apr 24 22:38:06.383217 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:38:06.383184 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-9btlx" Apr 24 22:38:08.393307 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:38:08.393279 2571 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-jhjrp" Apr 24 22:38:10.402038 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:38:10.402004 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-4zrjn" Apr 24 22:39:51.805692 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:39:51.805660 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log" Apr 24 22:39:51.807179 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:39:51.807144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log" Apr 24 22:44:51.837115 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:44:51.837087 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log" Apr 24 22:44:51.839082 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:44:51.839062 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log" Apr 24 22:48:54.411966 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:48:54.411931 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-9btlx_f5475da5-167b-425f-a7ad-5b0ce646f9fa/manager/0.log" Apr 24 22:49:11.745057 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:11.744976 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-7cd77c7ffd-g76n9_ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e/discovery/0.log" Apr 24 22:49:11.759928 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:11.759909 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-gl4xn_39bb0938-e759-4d2f-8431-f1d5fec395fe/istio-proxy/0.log" Apr 24 22:49:18.494810 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:18.494781 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8cmm2_46cdb586-7bdd-41d4-9d74-7e99334be435/global-pull-secret-syncer/0.log" Apr 24 22:49:18.613689 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:18.613662 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2zcz2_63b43425-e238-4a1e-a63c-4872ab241776/konnectivity-agent/0.log" Apr 24 22:49:18.771513 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:18.771434 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-9.ec2.internal_05a6b818b62eb67d03d54038b125e714/haproxy/0.log" Apr 24 22:49:22.760047 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:22.760016 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-4zrjn_b094af7d-c6dc-4c67-8371-167384023680/manager/0.log" Apr 24 22:49:22.786557 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:22.786533 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-jhjrp_188f2129-2d73-4254-a70a-f30693a840a7/manager/0.log" Apr 24 22:49:22.846785 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:22.846762 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-9btlx_f5475da5-167b-425f-a7ad-5b0ce646f9fa/manager/0.log" Apr 24 22:49:24.109356 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:24.109324 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gnqvl_c08a0646-f245-4a51-ac74-07a12a5ffe04/node-exporter/0.log" Apr 24 22:49:24.137086 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:49:24.137063 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gnqvl_c08a0646-f245-4a51-ac74-07a12a5ffe04/kube-rbac-proxy/0.log" Apr 24 22:49:24.171045 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:24.171025 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gnqvl_c08a0646-f245-4a51-ac74-07a12a5ffe04/init-textfile/0.log" Apr 24 22:49:27.071586 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.071557 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf"] Apr 24 22:49:27.074968 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.074949 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.076945 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.076923 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lfh5\"/\"kube-root-ca.crt\"" Apr 24 22:49:27.077041 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.076947 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5lfh5\"/\"default-dockercfg-p7s87\"" Apr 24 22:49:27.077041 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.076923 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lfh5\"/\"openshift-service-ca.crt\"" Apr 24 22:49:27.080441 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.080422 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf"] Apr 24 22:49:27.156061 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.156035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-podres\") pod 
\"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.156219 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.156065 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-sys\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.156219 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.156083 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-proc\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.156219 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.156140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-lib-modules\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.156346 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.156255 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448lg\" (UniqueName: \"kubernetes.io/projected/1b8731ad-8635-4c9a-ba20-102d7afef27c-kube-api-access-448lg\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.256871 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:49:27.256843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-448lg\" (UniqueName: \"kubernetes.io/projected/1b8731ad-8635-4c9a-ba20-102d7afef27c-kube-api-access-448lg\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.256971 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.256886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-podres\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.256971 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.256904 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-sys\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.256971 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.256956 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-sys\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.257070 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.256978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-proc\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " 
pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.257070 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.256997 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-lib-modules\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.257070 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.257017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-podres\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.257198 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.257077 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-proc\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.257198 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.257097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b8731ad-8635-4c9a-ba20-102d7afef27c-lib-modules\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.263834 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.263814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-448lg\" (UniqueName: 
\"kubernetes.io/projected/1b8731ad-8635-4c9a-ba20-102d7afef27c-kube-api-access-448lg\") pod \"perf-node-gather-daemonset-4vxhf\" (UID: \"1b8731ad-8635-4c9a-ba20-102d7afef27c\") " pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.386833 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.386808 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.503633 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.503607 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf"] Apr 24 22:49:27.505623 ip-10-0-133-9 kubenswrapper[2571]: W0424 22:49:27.505591 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1b8731ad_8635_4c9a_ba20_102d7afef27c.slice/crio-0faab391c7234d5456c4b3510571552f7fa5681809b942e12a834d4dbe8f5972 WatchSource:0}: Error finding container 0faab391c7234d5456c4b3510571552f7fa5681809b942e12a834d4dbe8f5972: Status 404 returned error can't find the container with id 0faab391c7234d5456c4b3510571552f7fa5681809b942e12a834d4dbe8f5972 Apr 24 22:49:27.507230 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.507207 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:49:27.798747 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.798671 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" event={"ID":"1b8731ad-8635-4c9a-ba20-102d7afef27c","Type":"ContainerStarted","Data":"aab0a18b45abab374803153099e3b99e10c83eab0682eba128bbb6373b7716d1"} Apr 24 22:49:27.798747 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.798714 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" 
event={"ID":"1b8731ad-8635-4c9a-ba20-102d7afef27c","Type":"ContainerStarted","Data":"0faab391c7234d5456c4b3510571552f7fa5681809b942e12a834d4dbe8f5972"} Apr 24 22:49:27.798907 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.798750 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:27.814443 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.814402 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" podStartSLOduration=0.814390932 podStartE2EDuration="814.390932ms" podCreationTimestamp="2026-04-24 22:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:49:27.813974256 +0000 UTC m=+1176.510597107" watchObservedRunningTime="2026-04-24 22:49:27.814390932 +0000 UTC m=+1176.511013787" Apr 24 22:49:27.914519 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.914493 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dktqj_859f8212-1b13-42e6-b832-83bafe50547d/dns/0.log" Apr 24 22:49:27.932515 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:27.932495 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dktqj_859f8212-1b13-42e6-b832-83bafe50547d/kube-rbac-proxy/0.log" Apr 24 22:49:28.064640 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:28.064580 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mjsxr_664d5264-1f8a-4986-9272-2e8a718a8923/dns-node-resolver/0.log" Apr 24 22:49:28.516965 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:28.516915 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bgw9g_e118d567-f878-439c-b74f-3e060f10ac46/node-ca/0.log" Apr 24 22:49:29.254338 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:49:29.254311 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-7cd77c7ffd-g76n9_ebbee1ad-1d5f-4fca-bbfe-f9bbb65ed55e/discovery/0.log" Apr 24 22:49:29.274805 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:29.274780 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-gl4xn_39bb0938-e759-4d2f-8431-f1d5fec395fe/istio-proxy/0.log" Apr 24 22:49:29.765134 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:29.765101 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vb7wv_dc34a3cc-7a87-42d1-a0c7-d317f40146bb/serve-healthcheck-canary/0.log" Apr 24 22:49:30.173565 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:30.173534 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dldk9_85bdc7f6-1a52-4eb1-92b9-d63889497856/kube-rbac-proxy/0.log" Apr 24 22:49:30.193141 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:30.193115 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dldk9_85bdc7f6-1a52-4eb1-92b9-d63889497856/exporter/0.log" Apr 24 22:49:30.212144 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:30.212117 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dldk9_85bdc7f6-1a52-4eb1-92b9-d63889497856/extractor/0.log" Apr 24 22:49:33.812552 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:33.812518 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5lfh5/perf-node-gather-daemonset-4vxhf" Apr 24 22:49:37.239696 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.239666 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-76v98_1b7f84b7-e2ee-446a-9a43-d262fa8dfe1e/kube-multus/0.log" Apr 24 22:49:37.593849 ip-10-0-133-9 kubenswrapper[2571]: I0424 
22:49:37.593823 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6k9b_31c0602b-6394-42d8-b7cc-1a807f7ea065/kube-multus-additional-cni-plugins/0.log" Apr 24 22:49:37.611632 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.611609 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6k9b_31c0602b-6394-42d8-b7cc-1a807f7ea065/egress-router-binary-copy/0.log" Apr 24 22:49:37.633618 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.633595 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6k9b_31c0602b-6394-42d8-b7cc-1a807f7ea065/cni-plugins/0.log" Apr 24 22:49:37.652024 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.652004 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6k9b_31c0602b-6394-42d8-b7cc-1a807f7ea065/bond-cni-plugin/0.log" Apr 24 22:49:37.669940 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.669917 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6k9b_31c0602b-6394-42d8-b7cc-1a807f7ea065/routeoverride-cni/0.log" Apr 24 22:49:37.687248 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.687227 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6k9b_31c0602b-6394-42d8-b7cc-1a807f7ea065/whereabouts-cni-bincopy/0.log" Apr 24 22:49:37.704683 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.704660 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-n6k9b_31c0602b-6394-42d8-b7cc-1a807f7ea065/whereabouts-cni/0.log" Apr 24 22:49:37.858695 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.858601 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-tphln_12c9d9cf-479c-46fd-9333-94213f4ff2f0/network-metrics-daemon/0.log" Apr 24 22:49:37.877199 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:37.877177 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tphln_12c9d9cf-479c-46fd-9333-94213f4ff2f0/kube-rbac-proxy/0.log" Apr 24 22:49:38.671058 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.670975 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-controller/0.log" Apr 24 22:49:38.685603 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.685582 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/0.log" Apr 24 22:49:38.695562 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.695542 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovn-acl-logging/1.log" Apr 24 22:49:38.715529 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.715508 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/kube-rbac-proxy-node/0.log" Apr 24 22:49:38.735900 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.735875 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 22:49:38.749988 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.749966 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/northd/0.log" Apr 24 22:49:38.767347 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.767330 2571 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/nbdb/0.log" Apr 24 22:49:38.785761 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.785738 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/sbdb/0.log" Apr 24 22:49:38.935235 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:38.935146 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46t57_dcc3ddf9-fec3-48d5-8871-ba8f9b5c3402/ovnkube-controller/0.log" Apr 24 22:49:40.615375 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:40.615347 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ngpww_cba3f39b-cb19-416b-a21a-64491aff6ce9/network-check-target-container/0.log" Apr 24 22:49:41.505200 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:41.505168 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-dwv7l_80bdebdd-794b-491a-b6b9-8ac831319fea/iptables-alerter/0.log" Apr 24 22:49:42.135608 ip-10-0-133-9 kubenswrapper[2571]: I0424 22:49:42.135579 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-stvmb_2c09ee36-d808-4615-bfd1-9a6a361f3a56/tuned/0.log"